Feb 13 15:56:53.165304 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 14:06:02 -00 2025
Feb 13 15:56:53.165359 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=85b856728ac62eb775b23688185fbd191f36059b11eac7a7eacb2da5f3555b05
Feb 13 15:56:53.165589 kernel: BIOS-provided physical RAM map:
Feb 13 15:56:53.165606 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 13 15:56:53.165618 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 13 15:56:53.165629 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 13 15:56:53.165661 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Feb 13 15:56:53.165672 kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Feb 13 15:56:53.165683 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 13 15:56:53.165695 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 13 15:56:53.165715 kernel: NX (Execute Disable) protection: active
Feb 13 15:56:53.165726 kernel: APIC: Static calls initialized
Feb 13 15:56:53.165745 kernel: SMBIOS 2.8 present.
Feb 13 15:56:53.165757 kernel: DMI: DigitalOcean Droplet/Droplet, BIOS 20171212 12/12/2017
Feb 13 15:56:53.165769 kernel: Hypervisor detected: KVM
Feb 13 15:56:53.165781 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 13 15:56:53.165802 kernel: kvm-clock: using sched offset of 4286711574 cycles
Feb 13 15:56:53.165815 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 13 15:56:53.165827 kernel: tsc: Detected 1995.312 MHz processor
Feb 13 15:56:53.165839 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 15:56:53.165852 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 15:56:53.165865 kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Feb 13 15:56:53.165877 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 13 15:56:53.165891 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 15:56:53.165943 kernel: ACPI: Early table checksum verification disabled
Feb 13 15:56:53.165957 kernel: ACPI: RSDP 0x00000000000F5950 000014 (v00 BOCHS )
Feb 13 15:56:53.165970 kernel: ACPI: RSDT 0x000000007FFE1986 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.165988 kernel: ACPI: FACP 0x000000007FFE176A 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.166002 kernel: ACPI: DSDT 0x000000007FFE0040 00172A (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.166015 kernel: ACPI: FACS 0x000000007FFE0000 000040
Feb 13 15:56:53.166028 kernel: ACPI: APIC 0x000000007FFE17DE 000080 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.166040 kernel: ACPI: HPET 0x000000007FFE185E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.166052 kernel: ACPI: SRAT 0x000000007FFE1896 0000C8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.166072 kernel: ACPI: WAET 0x000000007FFE195E 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 15:56:53.166084 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe176a-0x7ffe17dd]
Feb 13 15:56:53.166096 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffe0040-0x7ffe1769]
Feb 13 15:56:53.166107 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffe0000-0x7ffe003f]
Feb 13 15:56:53.166141 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe17de-0x7ffe185d]
Feb 13 15:56:53.166153 kernel: ACPI: Reserving HPET table memory at [mem 0x7ffe185e-0x7ffe1895]
Feb 13 15:56:53.166166 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe1896-0x7ffe195d]
Feb 13 15:56:53.166187 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe195e-0x7ffe1985]
Feb 13 15:56:53.166202 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Feb 13 15:56:53.166214 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Feb 13 15:56:53.166228 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Feb 13 15:56:53.166240 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Feb 13 15:56:53.166261 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdafff] -> [mem 0x00000000-0x7ffdafff]
Feb 13 15:56:53.166274 kernel: NODE_DATA(0) allocated [mem 0x7ffd5000-0x7ffdafff]
Feb 13 15:56:53.166292 kernel: Zone ranges:
Feb 13 15:56:53.166306 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 15:56:53.166321 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdafff]
Feb 13 15:56:53.166334 kernel: Normal empty
Feb 13 15:56:53.166346 kernel: Movable zone start for each node
Feb 13 15:56:53.166358 kernel: Early memory node ranges
Feb 13 15:56:53.166370 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Feb 13 15:56:53.166381 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdafff]
Feb 13 15:56:53.166394 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdafff]
Feb 13 15:56:53.166412 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 15:56:53.166425 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 13 15:56:53.166445 kernel: On node 0, zone DMA32: 37 pages in unavailable ranges
Feb 13 15:56:53.166459 kernel: ACPI: PM-Timer IO Port: 0x608
Feb 13 15:56:53.166474 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 13 15:56:53.166488 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 13 15:56:53.166499 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 13 15:56:53.166510 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 13 15:56:53.166523 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 13 15:56:53.166710 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 13 15:56:53.166740 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 13 15:56:53.166749 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 15:56:53.166762 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Feb 13 15:56:53.166771 kernel: TSC deadline timer available
Feb 13 15:56:53.166779 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Feb 13 15:56:53.166788 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 13 15:56:53.166796 kernel: [mem 0x80000000-0xfeffbfff] available for PCI devices
Feb 13 15:56:53.166810 kernel: Booting paravirtualized kernel on KVM
Feb 13 15:56:53.166818 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 15:56:53.166831 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Feb 13 15:56:53.166839 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Feb 13 15:56:53.166848 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Feb 13 15:56:53.166856 kernel: pcpu-alloc: [0] 0 1
Feb 13 15:56:53.166864 kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 13 15:56:53.166874 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=85b856728ac62eb775b23688185fbd191f36059b11eac7a7eacb2da5f3555b05
Feb 13 15:56:53.166884 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 15:56:53.166895 kernel: random: crng init done
Feb 13 15:56:53.166908 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 15:56:53.166916 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 13 15:56:53.166925 kernel: Fallback order for Node 0: 0
Feb 13 15:56:53.166938 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515803
Feb 13 15:56:53.166946 kernel: Policy zone: DMA32
Feb 13 15:56:53.166955 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 15:56:53.166963 kernel: Memory: 1969152K/2096612K available (14336K kernel code, 2299K rwdata, 22800K rodata, 43320K init, 1756K bss, 127200K reserved, 0K cma-reserved)
Feb 13 15:56:53.166972 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Feb 13 15:56:53.166980 kernel: Kernel/User page tables isolation: enabled
Feb 13 15:56:53.166991 kernel: ftrace: allocating 37890 entries in 149 pages
Feb 13 15:56:53.166999 kernel: ftrace: allocated 149 pages with 4 groups
Feb 13 15:56:53.167007 kernel: Dynamic Preempt: voluntary
Feb 13 15:56:53.167021 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 13 15:56:53.167031 kernel: rcu: RCU event tracing is enabled.
Feb 13 15:56:53.167039 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Feb 13 15:56:53.167048 kernel: Trampoline variant of Tasks RCU enabled.
Feb 13 15:56:53.167056 kernel: Rude variant of Tasks RCU enabled.
Feb 13 15:56:53.167064 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 15:56:53.167076 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 15:56:53.167085 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Feb 13 15:56:53.167093 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Feb 13 15:56:53.167104 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 13 15:56:53.174185 kernel: Console: colour VGA+ 80x25
Feb 13 15:56:53.174249 kernel: printk: console [tty0] enabled
Feb 13 15:56:53.174263 kernel: printk: console [ttyS0] enabled
Feb 13 15:56:53.174276 kernel: ACPI: Core revision 20230628
Feb 13 15:56:53.174289 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Feb 13 15:56:53.174313 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 15:56:53.174329 kernel: x2apic enabled
Feb 13 15:56:53.174341 kernel: APIC: Switched APIC routing to: physical x2apic
Feb 13 15:56:53.174353 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Feb 13 15:56:53.174366 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3985c314e25, max_idle_ns: 881590612270 ns
Feb 13 15:56:53.174379 kernel: Calibrating delay loop (skipped) preset value.. 3990.62 BogoMIPS (lpj=1995312)
Feb 13 15:56:53.174392 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Feb 13 15:56:53.174405 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Feb 13 15:56:53.174434 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 15:56:53.174447 kernel: Spectre V2 : Mitigation: Retpolines
Feb 13 15:56:53.174459 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 15:56:53.174475 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 13 15:56:53.174488 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Feb 13 15:56:53.174502 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 15:56:53.174515 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 13 15:56:53.174528 kernel: MDS: Mitigation: Clear CPU buffers
Feb 13 15:56:53.174542 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Feb 13 15:56:53.174570 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 15:56:53.174583 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 15:56:53.174595 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 15:56:53.174608 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 15:56:53.174621 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 13 15:56:53.174635 kernel: Freeing SMP alternatives memory: 32K
Feb 13 15:56:53.174648 kernel: pid_max: default: 32768 minimum: 301
Feb 13 15:56:53.174660 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Feb 13 15:56:53.174690 kernel: landlock: Up and running.
Feb 13 15:56:53.174707 kernel: SELinux: Initializing.
Feb 13 15:56:53.174720 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 15:56:53.174734 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 15:56:53.174748 kernel: smpboot: CPU0: Intel DO-Regular (family: 0x6, model: 0x4f, stepping: 0x1)
Feb 13 15:56:53.174763 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Feb 13 15:56:53.174779 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Feb 13 15:56:53.174793 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Feb 13 15:56:53.174806 kernel: Performance Events: unsupported p6 CPU model 79 no PMU driver, software events only.
Feb 13 15:56:53.174827 kernel: signal: max sigframe size: 1776
Feb 13 15:56:53.174866 kernel: rcu: Hierarchical SRCU implementation.
Feb 13 15:56:53.174883 kernel: rcu: Max phase no-delay instances is 400.
Feb 13 15:56:53.174896 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Feb 13 15:56:53.174910 kernel: smp: Bringing up secondary CPUs ...
Feb 13 15:56:53.174921 kernel: smpboot: x86: Booting SMP configuration:
Feb 13 15:56:53.174930 kernel: .... node #0, CPUs: #1
Feb 13 15:56:53.174940 kernel: smp: Brought up 1 node, 2 CPUs
Feb 13 15:56:53.174956 kernel: smpboot: Max logical packages: 1
Feb 13 15:56:53.174970 kernel: smpboot: Total of 2 processors activated (7981.24 BogoMIPS)
Feb 13 15:56:53.174980 kernel: devtmpfs: initialized
Feb 13 15:56:53.174989 kernel: x86/mm: Memory block size: 128MB
Feb 13 15:56:53.174998 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 13 15:56:53.175008 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Feb 13 15:56:53.175017 kernel: pinctrl core: initialized pinctrl subsystem
Feb 13 15:56:53.175026 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 13 15:56:53.175035 kernel: audit: initializing netlink subsys (disabled)
Feb 13 15:56:53.175044 kernel: audit: type=2000 audit(1739462211.267:1): state=initialized audit_enabled=0 res=1
Feb 13 15:56:53.175057 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 13 15:56:53.175066 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 13 15:56:53.175074 kernel: cpuidle: using governor menu
Feb 13 15:56:53.175083 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 13 15:56:53.175092 kernel: dca service started, version 1.12.1
Feb 13 15:56:53.175102 kernel: PCI: Using configuration type 1 for base access
Feb 13 15:56:53.175111 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 13 15:56:53.175156 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 13 15:56:53.175165 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 13 15:56:53.175177 kernel: ACPI: Added _OSI(Module Device)
Feb 13 15:56:53.175186 kernel: ACPI: Added _OSI(Processor Device)
Feb 13 15:56:53.175195 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 13 15:56:53.175204 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 13 15:56:53.175213 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 13 15:56:53.175222 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Feb 13 15:56:53.175231 kernel: ACPI: Interpreter enabled
Feb 13 15:56:53.175240 kernel: ACPI: PM: (supports S0 S5)
Feb 13 15:56:53.175249 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 13 15:56:53.175257 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 13 15:56:53.175271 kernel: PCI: Using E820 reservations for host bridge windows
Feb 13 15:56:53.175285 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 13 15:56:53.175299 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 13 15:56:53.175657 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Feb 13 15:56:53.175831 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Feb 13 15:56:53.175937 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Feb 13 15:56:53.175956 kernel: acpiphp: Slot [3] registered
Feb 13 15:56:53.175971 kernel: acpiphp: Slot [4] registered
Feb 13 15:56:53.175985 kernel: acpiphp: Slot [5] registered
Feb 13 15:56:53.175999 kernel: acpiphp: Slot [6] registered
Feb 13 15:56:53.176007 kernel: acpiphp: Slot [7] registered
Feb 13 15:56:53.176017 kernel: acpiphp: Slot [8] registered
Feb 13 15:56:53.176025 kernel: acpiphp: Slot [9] registered
Feb 13 15:56:53.176034 kernel: acpiphp: Slot [10] registered
Feb 13 15:56:53.176042 kernel: acpiphp: Slot [11] registered
Feb 13 15:56:53.176051 kernel: acpiphp: Slot [12] registered
Feb 13 15:56:53.176063 kernel: acpiphp: Slot [13] registered
Feb 13 15:56:53.176072 kernel: acpiphp: Slot [14] registered
Feb 13 15:56:53.176080 kernel: acpiphp: Slot [15] registered
Feb 13 15:56:53.176089 kernel: acpiphp: Slot [16] registered
Feb 13 15:56:53.176098 kernel: acpiphp: Slot [17] registered
Feb 13 15:56:53.176107 kernel: acpiphp: Slot [18] registered
Feb 13 15:56:53.178166 kernel: acpiphp: Slot [19] registered
Feb 13 15:56:53.178205 kernel: acpiphp: Slot [20] registered
Feb 13 15:56:53.178214 kernel: acpiphp: Slot [21] registered
Feb 13 15:56:53.178232 kernel: acpiphp: Slot [22] registered
Feb 13 15:56:53.178245 kernel: acpiphp: Slot [23] registered
Feb 13 15:56:53.178260 kernel: acpiphp: Slot [24] registered
Feb 13 15:56:53.178268 kernel: acpiphp: Slot [25] registered
Feb 13 15:56:53.178277 kernel: acpiphp: Slot [26] registered
Feb 13 15:56:53.178286 kernel: acpiphp: Slot [27] registered
Feb 13 15:56:53.178294 kernel: acpiphp: Slot [28] registered
Feb 13 15:56:53.178308 kernel: acpiphp: Slot [29] registered
Feb 13 15:56:53.178322 kernel: acpiphp: Slot [30] registered
Feb 13 15:56:53.178333 kernel: acpiphp: Slot [31] registered
Feb 13 15:56:53.178347 kernel: PCI host bridge to bus 0000:00
Feb 13 15:56:53.178576 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 13 15:56:53.178798 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 13 15:56:53.178928 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 13 15:56:53.179037 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Feb 13 15:56:53.179187 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window]
Feb 13 15:56:53.179309 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 13 15:56:53.179565 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 13 15:56:53.179772 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 13 15:56:53.179967 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 13 15:56:53.182596 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc1e0-0xc1ef]
Feb 13 15:56:53.182908 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Feb 13 15:56:53.183096 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Feb 13 15:56:53.183288 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Feb 13 15:56:53.183457 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Feb 13 15:56:53.183660 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 13 15:56:53.183784 kernel: pci 0000:00:01.2: reg 0x20: [io 0xc180-0xc19f]
Feb 13 15:56:53.183942 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 13 15:56:53.184065 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Feb 13 15:56:53.186470 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Feb 13 15:56:53.186803 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 13 15:56:53.187008 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 13 15:56:53.189109 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 13 15:56:53.189355 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfebf0000-0xfebf0fff]
Feb 13 15:56:53.189525 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfebe0000-0xfebeffff pref]
Feb 13 15:56:53.189689 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 13 15:56:53.189881 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 13 15:56:53.190064 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc1a0-0xc1bf]
Feb 13 15:56:53.190302 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebf1000-0xfebf1fff]
Feb 13 15:56:53.190456 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 13 15:56:53.190649 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
Feb 13 15:56:53.190838 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc1c0-0xc1df]
Feb 13 15:56:53.191001 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebf2000-0xfebf2fff]
Feb 13 15:56:53.191275 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 13 15:56:53.191539 kernel: pci 0000:00:05.0: [1af4:1004] type 00 class 0x010000
Feb 13 15:56:53.191718 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc100-0xc13f]
Feb 13 15:56:53.191887 kernel: pci 0000:00:05.0: reg 0x14: [mem 0xfebf3000-0xfebf3fff]
Feb 13 15:56:53.192102 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 13 15:56:53.192363 kernel: pci 0000:00:06.0: [1af4:1001] type 00 class 0x010000
Feb 13 15:56:53.192537 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc000-0xc07f]
Feb 13 15:56:53.192695 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfebf4000-0xfebf4fff]
Feb 13 15:56:53.192934 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 13 15:56:53.193175 kernel: pci 0000:00:07.0: [1af4:1001] type 00 class 0x010000
Feb 13 15:56:53.193364 kernel: pci 0000:00:07.0: reg 0x10: [io 0xc080-0xc0ff]
Feb 13 15:56:53.193529 kernel: pci 0000:00:07.0: reg 0x14: [mem 0xfebf5000-0xfebf5fff]
Feb 13 15:56:53.193737 kernel: pci 0000:00:07.0: reg 0x20: [mem 0xfe814000-0xfe817fff 64bit pref]
Feb 13 15:56:53.193933 kernel: pci 0000:00:08.0: [1af4:1002] type 00 class 0x00ff00
Feb 13 15:56:53.194141 kernel: pci 0000:00:08.0: reg 0x10: [io 0xc140-0xc17f]
Feb 13 15:56:53.194311 kernel: pci 0000:00:08.0: reg 0x20: [mem 0xfe818000-0xfe81bfff 64bit pref]
Feb 13 15:56:53.194329 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 13 15:56:53.194345 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 13 15:56:53.194360 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 13 15:56:53.194375 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 13 15:56:53.194389 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 13 15:56:53.194404 kernel: iommu: Default domain type: Translated
Feb 13 15:56:53.194426 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 13 15:56:53.194440 kernel: PCI: Using ACPI for IRQ routing
Feb 13 15:56:53.194455 kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 13 15:56:53.194471 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 13 15:56:53.194486 kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
Feb 13 15:56:53.194648 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 13 15:56:53.194964 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 13 15:56:53.195158 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 13 15:56:53.195179 kernel: vgaarb: loaded
Feb 13 15:56:53.195213 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Feb 13 15:56:53.195227 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Feb 13 15:56:53.195252 kernel: clocksource: Switched to clocksource kvm-clock
Feb 13 15:56:53.195267 kernel: VFS: Disk quotas dquot_6.6.0
Feb 13 15:56:53.195283 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 13 15:56:53.195298 kernel: pnp: PnP ACPI init
Feb 13 15:56:53.195311 kernel: pnp: PnP ACPI: found 4 devices
Feb 13 15:56:53.195324 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 13 15:56:53.195338 kernel: NET: Registered PF_INET protocol family
Feb 13 15:56:53.195357 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 13 15:56:53.195371 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Feb 13 15:56:53.195385 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 13 15:56:53.195401 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 13 15:56:53.195439 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 15:56:53.195455 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Feb 13 15:56:53.195470 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 15:56:53.195485 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 15:56:53.195500 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 13 15:56:53.195519 kernel: NET: Registered PF_XDP protocol family
Feb 13 15:56:53.195686 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 13 15:56:53.195828 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 13 15:56:53.195979 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 13 15:56:53.198172 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Feb 13 15:56:53.198430 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x17fffffff window]
Feb 13 15:56:53.198620 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 13 15:56:53.198817 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 13 15:56:53.198860 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 13 15:56:53.199030 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x7a0 took 43243 usecs
Feb 13 15:56:53.199051 kernel: PCI: CLS 0 bytes, default 64
Feb 13 15:56:53.199068 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Feb 13 15:56:53.199085 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x3985c314e25, max_idle_ns: 881590612270 ns
Feb 13 15:56:53.199102 kernel: Initialise system trusted keyrings
Feb 13 15:56:53.199265 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Feb 13 15:56:53.199281 kernel: Key type asymmetric registered
Feb 13 15:56:53.199303 kernel: Asymmetric key parser 'x509' registered
Feb 13 15:56:53.199315 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Feb 13 15:56:53.199329 kernel: io scheduler mq-deadline registered
Feb 13 15:56:53.199343 kernel: io scheduler kyber registered
Feb 13 15:56:53.199356 kernel: io scheduler bfq registered
Feb 13 15:56:53.199371 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Feb 13 15:56:53.199406 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 13 15:56:53.199421 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 13 15:56:53.199435 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 13 15:56:53.199447 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 13 15:56:53.199467 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 13 15:56:53.199480 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 13 15:56:53.199495 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 13 15:56:53.199507 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 13 15:56:53.199521 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Feb 13 15:56:53.199753 kernel: rtc_cmos 00:03: RTC can wake from S4
Feb 13 15:56:53.199901 kernel: rtc_cmos 00:03: registered as rtc0
Feb 13 15:56:53.200066 kernel: rtc_cmos 00:03: setting system clock to 2025-02-13T15:56:52 UTC (1739462212)
Feb 13 15:56:53.201440 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Feb 13 15:56:53.201486 kernel: intel_pstate: CPU model not supported
Feb 13 15:56:53.201503 kernel: NET: Registered PF_INET6 protocol family
Feb 13 15:56:53.201517 kernel: Segment Routing with IPv6
Feb 13 15:56:53.201531 kernel: In-situ OAM (IOAM) with IPv6
Feb 13 15:56:53.201546 kernel: NET: Registered PF_PACKET protocol family
Feb 13 15:56:53.201560 kernel: Key type dns_resolver registered
Feb 13 15:56:53.201575 kernel: IPI shorthand broadcast: enabled
Feb 13 15:56:53.201600 kernel: sched_clock: Marking stable (1444004753, 174056610)->(1738890803, -120829440)
Feb 13 15:56:53.201615 kernel: registered taskstats version 1
Feb 13 15:56:53.201630 kernel: Loading compiled-in X.509 certificates
Feb 13 15:56:53.201646 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 3d19ae6dcd850c11d55bf09bd44e00c45ed399eb'
Feb 13 15:56:53.201663 kernel: Key type .fscrypt registered
Feb 13 15:56:53.201678 kernel: Key type fscrypt-provisioning registered
Feb 13 15:56:53.201698 kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 13 15:56:53.201713 kernel: ima: Allocated hash algorithm: sha1
Feb 13 15:56:53.201729 kernel: ima: No architecture policies found
Feb 13 15:56:53.201749 kernel: clk: Disabling unused clocks
Feb 13 15:56:53.201764 kernel: Freeing unused kernel image (initmem) memory: 43320K
Feb 13 15:56:53.201779 kernel: Write protecting the kernel read-only data: 38912k
Feb 13 15:56:53.201819 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K
Feb 13 15:56:53.201838 kernel: Run /init as init process
Feb 13 15:56:53.201853 kernel: with arguments:
Feb 13 15:56:53.201868 kernel: /init
Feb 13 15:56:53.201882 kernel: with environment:
Feb 13 15:56:53.201899 kernel: HOME=/
Feb 13 15:56:53.201917 kernel: TERM=linux
Feb 13 15:56:53.201930 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Feb 13 15:56:53.201949 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Feb 13 15:56:53.201968 systemd[1]: Detected virtualization kvm.
Feb 13 15:56:53.201982 systemd[1]: Detected architecture x86-64.
Feb 13 15:56:53.201996 systemd[1]: Running in initrd.
Feb 13 15:56:53.202010 systemd[1]: No hostname configured, using default hostname.
Feb 13 15:56:53.202023 systemd[1]: Hostname set to .
Feb 13 15:56:53.202041 systemd[1]: Initializing machine ID from VM UUID.
Feb 13 15:56:53.202056 systemd[1]: Queued start job for default target initrd.target.
Feb 13 15:56:53.202070 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 15:56:53.202088 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 15:56:53.202105 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Feb 13 15:56:53.202140 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 15:56:53.202155 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Feb 13 15:56:53.202176 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Feb 13 15:56:53.202194 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Feb 13 15:56:53.202213 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Feb 13 15:56:53.202229 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 15:56:53.202266 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 15:56:53.202282 systemd[1]: Reached target paths.target - Path Units.
Feb 13 15:56:53.202297 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 15:56:53.202321 systemd[1]: Reached target swap.target - Swaps.
Feb 13 15:56:53.202340 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 15:56:53.202357 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 15:56:53.202370 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 15:56:53.202385 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Feb 13 15:56:53.202402 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Feb 13 15:56:53.202421 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 15:56:53.202438 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 15:56:53.202453 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 15:56:53.202469 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 15:56:53.202485 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Feb 13 15:56:53.202496 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 15:56:53.202506 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Feb 13 15:56:53.202516 systemd[1]: Starting systemd-fsck-usr.service...
Feb 13 15:56:53.202530 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 15:56:53.202540 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 15:56:53.202549 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:56:53.202559 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Feb 13 15:56:53.202624 systemd-journald[182]: Collecting audit messages is disabled.
Feb 13 15:56:53.202658 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 15:56:53.202668 systemd[1]: Finished systemd-fsck-usr.service.
Feb 13 15:56:53.202702 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Feb 13 15:56:53.202721 systemd-journald[182]: Journal started
Feb 13 15:56:53.202766 systemd-journald[182]: Runtime Journal (/run/log/journal/d5bab27dbead42659f3f58e5321fce5e) is 4.9M, max 39.3M, 34.4M free.
Feb 13 15:56:53.207153 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 15:56:53.210335 systemd-modules-load[183]: Inserted module 'overlay'
Feb 13 15:56:53.260793 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 13 15:56:53.260833 kernel: Bridge firewalling registered
Feb 13 15:56:53.248359 systemd-modules-load[183]: Inserted module 'br_netfilter'
Feb 13 15:56:53.263298 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 15:56:53.264691 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:56:53.273905 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 15:56:53.286199 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 15:56:53.289618 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 15:56:53.299456 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 15:56:53.307977 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 15:56:53.325535 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 15:56:53.326842 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 15:56:53.328628 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 15:56:53.339480 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Feb 13 15:56:53.341655 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 15:56:53.355539 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 15:56:53.378003 dracut-cmdline[216]: dracut-dracut-053
Feb 13 15:56:53.384035 dracut-cmdline[216]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=85b856728ac62eb775b23688185fbd191f36059b11eac7a7eacb2da5f3555b05
Feb 13 15:56:53.393842 systemd-resolved[218]: Positive Trust Anchors:
Feb 13 15:56:53.393858 systemd-resolved[218]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 15:56:53.393895 systemd-resolved[218]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 15:56:53.398898 systemd-resolved[218]: Defaulting to hostname 'linux'.
Feb 13 15:56:53.401148 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 15:56:53.403091 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 15:56:53.550208 kernel: SCSI subsystem initialized
Feb 13 15:56:53.566197 kernel: Loading iSCSI transport class v2.0-870.
Feb 13 15:56:53.591527 kernel: iscsi: registered transport (tcp)
Feb 13 15:56:53.623543 kernel: iscsi: registered transport (qla4xxx)
Feb 13 15:56:53.623707 kernel: QLogic iSCSI HBA Driver
Feb 13 15:56:53.707474 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Feb 13 15:56:53.715489 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Feb 13 15:56:53.770808 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 13 15:56:53.770944 kernel: device-mapper: uevent: version 1.0.3
Feb 13 15:56:53.775448 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Feb 13 15:56:53.836250 kernel: raid6: avx2x4 gen() 16262 MB/s
Feb 13 15:56:53.853353 kernel: raid6: avx2x2 gen() 17733 MB/s
Feb 13 15:56:53.871294 kernel: raid6: avx2x1 gen() 11264 MB/s
Feb 13 15:56:53.871410 kernel: raid6: using algorithm avx2x2 gen() 17733 MB/s
Feb 13 15:56:53.890621 kernel: raid6: .... xor() 12402 MB/s, rmw enabled
Feb 13 15:56:53.890908 kernel: raid6: using avx2x2 recovery algorithm
Feb 13 15:56:53.938199 kernel: xor: automatically using best checksumming function avx
Feb 13 15:56:54.168885 kernel: Btrfs loaded, zoned=no, fsverity=no
Feb 13 15:56:54.201350 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 15:56:54.209564 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 15:56:54.247499 systemd-udevd[402]: Using default interface naming scheme 'v255'.
Feb 13 15:56:54.253996 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 15:56:54.262356 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Feb 13 15:56:54.294593 dracut-pre-trigger[410]: rd.md=0: removing MD RAID activation
Feb 13 15:56:54.357639 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 15:56:54.364596 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 15:56:54.469257 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 15:56:54.480483 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Feb 13 15:56:54.518896 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Feb 13 15:56:54.522624 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 15:56:54.526312 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 15:56:54.530079 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 15:56:54.539658 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Feb 13 15:56:54.588328 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 15:56:54.623377 kernel: virtio_blk virtio4: 1/0/0 default/read/poll queues
Feb 13 15:56:54.724726 kernel: virtio_blk virtio4: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Feb 13 15:56:54.724977 kernel: scsi host0: Virtio SCSI HBA
Feb 13 15:56:54.725236 kernel: cryptd: max_cpu_qlen set to 1000
Feb 13 15:56:54.725256 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 13 15:56:54.725274 kernel: ACPI: bus type USB registered
Feb 13 15:56:54.725290 kernel: GPT:9289727 != 125829119
Feb 13 15:56:54.725311 kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 13 15:56:54.725328 kernel: GPT:9289727 != 125829119
Feb 13 15:56:54.725341 kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 13 15:56:54.725360 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 15:56:54.725376 kernel: usbcore: registered new interface driver usbfs
Feb 13 15:56:54.725392 kernel: virtio_blk virtio5: 1/0/0 default/read/poll queues
Feb 13 15:56:54.742005 kernel: virtio_blk virtio5: [vdb] 932 512-byte logical blocks (477 kB/466 KiB)
Feb 13 15:56:54.742227 kernel: libata version 3.00 loaded.
Feb 13 15:56:54.742250 kernel: usbcore: registered new interface driver hub
Feb 13 15:56:54.742268 kernel: usbcore: registered new device driver usb
Feb 13 15:56:54.680215 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 15:56:54.680424 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 15:56:54.684279 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 15:56:54.685175 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 15:56:54.685525 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:56:54.686788 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:56:54.697801 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:56:54.762347 kernel: ata_piix 0000:00:01.1: version 2.13
Feb 13 15:56:54.775950 kernel: scsi host1: ata_piix
Feb 13 15:56:54.778927 kernel: scsi host2: ata_piix
Feb 13 15:56:54.779229 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc1e0 irq 14
Feb 13 15:56:54.779255 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc1e8 irq 15
Feb 13 15:56:54.845575 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:56:54.863800 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 15:56:54.888602 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Feb 13 15:56:54.932616 kernel: BTRFS: device fsid 0e178e67-0100-48b1-87c9-422b9a68652a devid 1 transid 41 /dev/vda3 scanned by (udev-worker) (448)
Feb 13 15:56:54.932652 kernel: AVX2 version of gcm_enc/dec engaged.
Feb 13 15:56:54.932677 kernel: AES CTR mode by8 optimization enabled
Feb 13 15:56:54.932689 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (452)
Feb 13 15:56:54.960550 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Feb 13 15:56:54.973343 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Feb 13 15:56:54.976255 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Feb 13 15:56:54.990807 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Feb 13 15:56:54.994025 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 15:56:55.004627 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Feb 13 15:56:55.022335 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 13 15:56:55.031725 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 13 15:56:55.031942 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 13 15:56:55.032189 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c180
Feb 13 15:56:55.032402 kernel: hub 1-0:1.0: USB hub found
Feb 13 15:56:55.032669 kernel: hub 1-0:1.0: 2 ports detected
Feb 13 15:56:55.034429 disk-uuid[550]: Primary Header is updated.
Feb 13 15:56:55.034429 disk-uuid[550]: Secondary Entries is updated.
Feb 13 15:56:55.034429 disk-uuid[550]: Secondary Header is updated.
Feb 13 15:56:55.041161 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 15:56:55.051335 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 15:56:56.059290 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 15:56:56.059881 disk-uuid[551]: The operation has completed successfully.
Feb 13 15:56:56.168445 systemd[1]: disk-uuid.service: Deactivated successfully.
Feb 13 15:56:56.168621 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Feb 13 15:56:56.179849 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Feb 13 15:56:56.195851 sh[562]: Success
Feb 13 15:56:56.226249 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Feb 13 15:56:56.321005 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Feb 13 15:56:56.323384 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Feb 13 15:56:56.339402 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Feb 13 15:56:56.372204 kernel: BTRFS info (device dm-0): first mount of filesystem 0e178e67-0100-48b1-87c9-422b9a68652a
Feb 13 15:56:56.372322 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Feb 13 15:56:56.374857 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Feb 13 15:56:56.377889 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Feb 13 15:56:56.378017 kernel: BTRFS info (device dm-0): using free space tree
Feb 13 15:56:56.397540 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Feb 13 15:56:56.399380 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Feb 13 15:56:56.406507 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Feb 13 15:56:56.411451 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Feb 13 15:56:56.428544 kernel: BTRFS info (device vda6): first mount of filesystem c26baa82-37e4-4435-b3ec-4748612bc475
Feb 13 15:56:56.428646 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 15:56:56.428667 kernel: BTRFS info (device vda6): using free space tree
Feb 13 15:56:56.443221 kernel: BTRFS info (device vda6): auto enabling async discard
Feb 13 15:56:56.462352 systemd[1]: mnt-oem.mount: Deactivated successfully.
Feb 13 15:56:56.466268 kernel: BTRFS info (device vda6): last unmount of filesystem c26baa82-37e4-4435-b3ec-4748612bc475
Feb 13 15:56:56.483463 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Feb 13 15:56:56.492524 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Feb 13 15:56:56.623168 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 15:56:56.635360 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 15:56:56.691880 ignition[662]: Ignition 2.20.0
Feb 13 15:56:56.691920 ignition[662]: Stage: fetch-offline
Feb 13 15:56:56.692018 ignition[662]: no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:56.692039 ignition[662]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:56.692396 ignition[662]: parsed url from cmdline: ""
Feb 13 15:56:56.692405 ignition[662]: no config URL provided
Feb 13 15:56:56.692416 ignition[662]: reading system config file "/usr/lib/ignition/user.ign"
Feb 13 15:56:56.692431 ignition[662]: no config at "/usr/lib/ignition/user.ign"
Feb 13 15:56:56.692441 ignition[662]: failed to fetch config: resource requires networking
Feb 13 15:56:56.692780 ignition[662]: Ignition finished successfully
Feb 13 15:56:56.701289 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 15:56:56.717138 systemd-networkd[748]: lo: Link UP
Feb 13 15:56:56.717156 systemd-networkd[748]: lo: Gained carrier
Feb 13 15:56:56.721711 systemd-networkd[748]: Enumeration completed
Feb 13 15:56:56.721962 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 15:56:56.723048 systemd-networkd[748]: eth0: found matching network '/usr/lib/systemd/network/yy-digitalocean.network', based on potentially unpredictable interface name.
Feb 13 15:56:56.723056 systemd-networkd[748]: eth0: Configuring with /usr/lib/systemd/network/yy-digitalocean.network.
Feb 13 15:56:56.723172 systemd[1]: Reached target network.target - Network.
Feb 13 15:56:56.724598 systemd-networkd[748]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Feb 13 15:56:56.724605 systemd-networkd[748]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Feb 13 15:56:56.725883 systemd-networkd[748]: eth0: Link UP
Feb 13 15:56:56.725891 systemd-networkd[748]: eth0: Gained carrier
Feb 13 15:56:56.725908 systemd-networkd[748]: eth0: found matching network '/usr/lib/systemd/network/yy-digitalocean.network', based on potentially unpredictable interface name.
Feb 13 15:56:56.735964 systemd-networkd[748]: eth1: Link UP
Feb 13 15:56:56.735970 systemd-networkd[748]: eth1: Gained carrier
Feb 13 15:56:56.735990 systemd-networkd[748]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Feb 13 15:56:56.737836 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Feb 13 15:56:56.756856 systemd-networkd[748]: eth1: DHCPv4 address 10.124.0.2/20 acquired from 169.254.169.253
Feb 13 15:56:56.763889 ignition[754]: Ignition 2.20.0
Feb 13 15:56:56.763908 ignition[754]: Stage: fetch
Feb 13 15:56:56.764294 systemd-networkd[748]: eth0: DHCPv4 address 143.198.68.221/20, gateway 143.198.64.1 acquired from 169.254.169.253
Feb 13 15:56:56.764759 ignition[754]: no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:56.764777 ignition[754]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:56.765012 ignition[754]: parsed url from cmdline: ""
Feb 13 15:56:56.765018 ignition[754]: no config URL provided
Feb 13 15:56:56.765029 ignition[754]: reading system config file "/usr/lib/ignition/user.ign"
Feb 13 15:56:56.765042 ignition[754]: no config at "/usr/lib/ignition/user.ign"
Feb 13 15:56:56.765081 ignition[754]: GET http://169.254.169.254/metadata/v1/user-data: attempt #1
Feb 13 15:56:56.793509 ignition[754]: GET result: OK
Feb 13 15:56:56.793686 ignition[754]: parsing config with SHA512: 7c7334817db3b2dbc27fa201b278fd4b0745ddf5800aa45b7d206009996fcefd7fc923d79d7c520eedd641ebb60370cc2d94654df299673f82cfe97597a26508
Feb 13 15:56:56.804113 unknown[754]: fetched base config from "system"
Feb 13 15:56:56.804164 unknown[754]: fetched base config from "system"
Feb 13 15:56:56.804175 unknown[754]: fetched user config from "digitalocean"
Feb 13 15:56:56.805621 ignition[754]: fetch: fetch complete
Feb 13 15:56:56.805633 ignition[754]: fetch: fetch passed
Feb 13 15:56:56.805742 ignition[754]: Ignition finished successfully
Feb 13 15:56:56.808210 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Feb 13 15:56:56.823755 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Feb 13 15:56:56.860060 ignition[761]: Ignition 2.20.0
Feb 13 15:56:56.860081 ignition[761]: Stage: kargs
Feb 13 15:56:56.860525 ignition[761]: no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:56.860546 ignition[761]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:56.865320 ignition[761]: kargs: kargs passed
Feb 13 15:56:56.865446 ignition[761]: Ignition finished successfully
Feb 13 15:56:56.868779 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Feb 13 15:56:56.883588 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Feb 13 15:56:56.911798 ignition[768]: Ignition 2.20.0
Feb 13 15:56:56.911817 ignition[768]: Stage: disks
Feb 13 15:56:56.912234 ignition[768]: no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:56.912264 ignition[768]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:56.920489 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Feb 13 15:56:56.915005 ignition[768]: disks: disks passed
Feb 13 15:56:56.944606 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Feb 13 15:56:56.917656 ignition[768]: Ignition finished successfully
Feb 13 15:56:56.948882 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Feb 13 15:56:56.950472 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 15:56:56.952777 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 15:56:56.953429 systemd[1]: Reached target basic.target - Basic System.
Feb 13 15:56:56.961526 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Feb 13 15:56:56.996326 systemd-fsck[777]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Feb 13 15:56:57.001373 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Feb 13 15:56:57.009728 systemd[1]: Mounting sysroot.mount - /sysroot...
Feb 13 15:56:57.168160 kernel: EXT4-fs (vda9): mounted filesystem e45e00fd-a630-4f0f-91bb-bc879e42a47e r/w with ordered data mode. Quota mode: none.
Feb 13 15:56:57.169838 systemd[1]: Mounted sysroot.mount - /sysroot.
Feb 13 15:56:57.171402 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Feb 13 15:56:57.177331 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 15:56:57.191572 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Feb 13 15:56:57.196956 systemd[1]: Starting flatcar-afterburn-network.service - Flatcar Afterburn network service...
Feb 13 15:56:57.201101 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Feb 13 15:56:57.205080 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Feb 13 15:56:57.205164 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 15:56:57.211615 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Feb 13 15:56:57.226296 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Feb 13 15:56:57.230677 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (786)
Feb 13 15:56:57.235110 kernel: BTRFS info (device vda6): first mount of filesystem c26baa82-37e4-4435-b3ec-4748612bc475
Feb 13 15:56:57.239429 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 15:56:57.239562 kernel: BTRFS info (device vda6): using free space tree
Feb 13 15:56:57.252237 kernel: BTRFS info (device vda6): auto enabling async discard
Feb 13 15:56:57.275965 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 15:56:57.349530 coreos-metadata[789]: Feb 13 15:56:57.348 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 15:56:57.360472 initrd-setup-root[816]: cut: /sysroot/etc/passwd: No such file or directory
Feb 13 15:56:57.366616 coreos-metadata[789]: Feb 13 15:56:57.366 INFO Fetch successful
Feb 13 15:56:57.367791 coreos-metadata[788]: Feb 13 15:56:57.366 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 15:56:57.373755 coreos-metadata[789]: Feb 13 15:56:57.373 INFO wrote hostname ci-4186.1.1-1-7a196a8365 to /sysroot/etc/hostname
Feb 13 15:56:57.376334 initrd-setup-root[823]: cut: /sysroot/etc/group: No such file or directory
Feb 13 15:56:57.377814 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Feb 13 15:56:57.381929 coreos-metadata[788]: Feb 13 15:56:57.380 INFO Fetch successful
Feb 13 15:56:57.387633 initrd-setup-root[831]: cut: /sysroot/etc/shadow: No such file or directory
Feb 13 15:56:57.392526 systemd[1]: flatcar-afterburn-network.service: Deactivated successfully.
Feb 13 15:56:57.395493 systemd[1]: Finished flatcar-afterburn-network.service - Flatcar Afterburn network service.
Feb 13 15:56:57.400830 initrd-setup-root[838]: cut: /sysroot/etc/gshadow: No such file or directory
Feb 13 15:56:57.568587 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Feb 13 15:56:57.575386 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Feb 13 15:56:57.578492 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Feb 13 15:56:57.597682 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Feb 13 15:56:57.598824 kernel: BTRFS info (device vda6): last unmount of filesystem c26baa82-37e4-4435-b3ec-4748612bc475
Feb 13 15:56:57.623395 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Feb 13 15:56:57.647266 ignition[907]: INFO : Ignition 2.20.0
Feb 13 15:56:57.647266 ignition[907]: INFO : Stage: mount
Feb 13 15:56:57.650031 ignition[907]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:57.650031 ignition[907]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:57.650031 ignition[907]: INFO : mount: mount passed
Feb 13 15:56:57.650031 ignition[907]: INFO : Ignition finished successfully
Feb 13 15:56:57.651834 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Feb 13 15:56:57.672491 systemd[1]: Starting ignition-files.service - Ignition (files)...
Feb 13 15:56:57.693479 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 15:56:57.704181 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (918)
Feb 13 15:56:57.707219 kernel: BTRFS info (device vda6): first mount of filesystem c26baa82-37e4-4435-b3ec-4748612bc475
Feb 13 15:56:57.707344 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 15:56:57.708496 kernel: BTRFS info (device vda6): using free space tree
Feb 13 15:56:57.714199 kernel: BTRFS info (device vda6): auto enabling async discard
Feb 13 15:56:57.719363 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 15:56:57.764316 ignition[935]: INFO : Ignition 2.20.0
Feb 13 15:56:57.764316 ignition[935]: INFO : Stage: files
Feb 13 15:56:57.764316 ignition[935]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:57.764316 ignition[935]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:57.769742 ignition[935]: DEBUG : files: compiled without relabeling support, skipping
Feb 13 15:56:57.769742 ignition[935]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Feb 13 15:56:57.769742 ignition[935]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Feb 13 15:56:57.773707 ignition[935]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Feb 13 15:56:57.774624 ignition[935]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Feb 13 15:56:57.776018 ignition[935]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Feb 13 15:56:57.774682 unknown[935]: wrote ssh authorized keys file for user: core
Feb 13 15:56:57.778182 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh"
Feb 13 15:56:57.778182 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh"
Feb 13 15:56:57.780363 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 15:56:57.780363 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 15:56:57.780363 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 15:56:57.780363 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 15:56:57.780363 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 15:56:57.780363 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Feb 13 15:56:58.017601 systemd-networkd[748]: eth1: Gained IPv6LL
Feb 13 15:56:58.149902 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK
Feb 13 15:56:58.529395 systemd-networkd[748]: eth0: Gained IPv6LL
Feb 13 15:56:58.543193 ignition[935]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 15:56:58.544784 ignition[935]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 15:56:58.544784 ignition[935]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 15:56:58.544784 ignition[935]: INFO : files: files passed
Feb 13 15:56:58.544784 ignition[935]: INFO : Ignition finished successfully
Feb 13 15:56:58.546046 systemd[1]: Finished ignition-files.service - Ignition (files).
Feb 13 15:56:58.552532 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Feb 13 15:56:58.556464 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Feb 13 15:56:58.572230 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 13 15:56:58.572407 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Feb 13 15:56:58.585474 initrd-setup-root-after-ignition[964]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 15:56:58.585474 initrd-setup-root-after-ignition[964]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 15:56:58.587725 initrd-setup-root-after-ignition[968]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 15:56:58.591079 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 15:56:58.593004 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Feb 13 15:56:58.601552 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Feb 13 15:56:58.642357 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 13 15:56:58.642588 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Feb 13 15:56:58.649718 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Feb 13 15:56:58.650776 systemd[1]: Reached target initrd.target - Initrd Default Target.
Feb 13 15:56:58.652769 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Feb 13 15:56:58.658514 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Feb 13 15:56:58.680953 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 15:56:58.690552 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Feb 13 15:56:58.715465 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Feb 13 15:56:58.716614 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 15:56:58.717904 systemd[1]: Stopped target timers.target - Timer Units.
Feb 13 15:56:58.719351 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 13 15:56:58.719569 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 15:56:58.720956 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Feb 13 15:56:58.721823 systemd[1]: Stopped target basic.target - Basic System.
Feb 13 15:56:58.723192 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Feb 13 15:56:58.724546 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 15:56:58.725800 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Feb 13 15:56:58.727244 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Feb 13 15:56:58.727962 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 15:56:58.729740 systemd[1]: Stopped target sysinit.target - System Initialization.
Feb 13 15:56:58.731444 systemd[1]: Stopped target local-fs.target - Local File Systems.
Feb 13 15:56:58.732610 systemd[1]: Stopped target swap.target - Swaps.
Feb 13 15:56:58.734104 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 13 15:56:58.734372 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 15:56:58.736728 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Feb 13 15:56:58.737752 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 15:56:58.739278 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Feb 13 15:56:58.739734 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 15:56:58.740864 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 13 15:56:58.741187 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Feb 13 15:56:58.746987 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Feb 13 15:56:58.747314 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 15:56:58.748578 systemd[1]: ignition-files.service: Deactivated successfully.
Feb 13 15:56:58.748825 systemd[1]: Stopped ignition-files.service - Ignition (files).
Feb 13 15:56:58.749846 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Feb 13 15:56:58.750150 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Feb 13 15:56:58.759692 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Feb 13 15:56:58.761628 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 13 15:56:58.763304 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 15:56:58.769003 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Feb 13 15:56:58.775612 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 13 15:56:58.775962 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 15:56:58.779015 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 13 15:56:58.781385 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 15:56:58.793687 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 13 15:56:58.793867 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Feb 13 15:56:58.803174 ignition[988]: INFO : Ignition 2.20.0
Feb 13 15:56:58.803174 ignition[988]: INFO : Stage: umount
Feb 13 15:56:58.803174 ignition[988]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 15:56:58.803174 ignition[988]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 15:56:58.808486 ignition[988]: INFO : umount: umount passed
Feb 13 15:56:58.808486 ignition[988]: INFO : Ignition finished successfully
Feb 13 15:56:58.807760 systemd[1]: ignition-mount.service: Deactivated successfully.
Feb 13 15:56:58.807947 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Feb 13 15:56:58.809879 systemd[1]: ignition-disks.service: Deactivated successfully.
Feb 13 15:56:58.810041 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Feb 13 15:56:58.812995 systemd[1]: ignition-kargs.service: Deactivated successfully.
Feb 13 15:56:58.813108 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Feb 13 15:56:58.815982 systemd[1]: ignition-fetch.service: Deactivated successfully.
Feb 13 15:56:58.816088 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Feb 13 15:56:58.817669 systemd[1]: Stopped target network.target - Network.
Feb 13 15:56:58.820519 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Feb 13 15:56:58.820763 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 15:56:58.823368 systemd[1]: Stopped target paths.target - Path Units.
Feb 13 15:56:58.827408 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 13 15:56:58.834254 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 15:56:58.836641 systemd[1]: Stopped target slices.target - Slice Units.
Feb 13 15:56:58.839317 systemd[1]: Stopped target sockets.target - Socket Units.
Feb 13 15:56:58.842109 systemd[1]: iscsid.socket: Deactivated successfully.
Feb 13 15:56:58.842232 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 15:56:58.843930 systemd[1]: iscsiuio.socket: Deactivated successfully.
Feb 13 15:56:58.844012 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 15:56:58.845447 systemd[1]: ignition-setup.service: Deactivated successfully.
Feb 13 15:56:58.845564 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Feb 13 15:56:58.846540 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Feb 13 15:56:58.846606 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Feb 13 15:56:58.848275 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Feb 13 15:56:58.855260 systemd-networkd[748]: eth1: DHCPv6 lease lost
Feb 13 15:56:58.857313 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Feb 13 15:56:58.860379 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Feb 13 15:56:58.861287 systemd-networkd[748]: eth0: DHCPv6 lease lost
Feb 13 15:56:58.862306 systemd[1]: sysroot-boot.service: Deactivated successfully.
Feb 13 15:56:58.862508 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Feb 13 15:56:58.864718 systemd[1]: systemd-networkd.service: Deactivated successfully.
Feb 13 15:56:58.864849 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Feb 13 15:56:58.865884 systemd[1]: systemd-resolved.service: Deactivated successfully.
Feb 13 15:56:58.866015 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Feb 13 15:56:58.870670 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Feb 13 15:56:58.870796 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 15:56:58.873198 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Feb 13 15:56:58.873334 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Feb 13 15:56:58.881350 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Feb 13 15:56:58.883068 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Feb 13 15:56:58.883228 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 15:56:58.884793 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 13 15:56:58.884884 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Feb 13 15:56:58.886993 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 13 15:56:58.887084 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Feb 13 15:56:58.888452 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 13 15:56:58.888541 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 15:56:58.892312 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 15:56:58.916355 systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 13 15:56:58.916600 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 15:56:58.918324 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 13 15:56:58.918427 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Feb 13 15:56:58.919711 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 13 15:56:58.919756 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 15:56:58.921286 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 13 15:56:58.921354 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 15:56:58.923679 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 13 15:56:58.923768 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Feb 13 15:56:58.927339 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 15:56:58.927457 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 15:56:58.937475 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Feb 13 15:56:58.939406 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 13 15:56:58.939535 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 15:56:58.941388 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 15:56:58.941493 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:56:58.944210 systemd[1]: network-cleanup.service: Deactivated successfully.
Feb 13 15:56:58.944338 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Feb 13 15:56:58.946989 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 13 15:56:58.947169 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Feb 13 15:56:58.949909 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Feb 13 15:56:58.958544 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Feb 13 15:56:58.973168 systemd[1]: Switching root.
Feb 13 15:56:59.062545 systemd-journald[182]: Journal stopped
Feb 13 15:57:00.856726 systemd-journald[182]: Received SIGTERM from PID 1 (systemd).
Feb 13 15:57:00.856874 kernel: SELinux: policy capability network_peer_controls=1
Feb 13 15:57:00.856900 kernel: SELinux: policy capability open_perms=1
Feb 13 15:57:00.856921 kernel: SELinux: policy capability extended_socket_class=1
Feb 13 15:57:00.856940 kernel: SELinux: policy capability always_check_network=0
Feb 13 15:57:00.856959 kernel: SELinux: policy capability cgroup_seclabel=1
Feb 13 15:57:00.856980 kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 13 15:57:00.856999 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Feb 13 15:57:00.857021 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Feb 13 15:57:00.857046 kernel: audit: type=1403 audit(1739462219.363:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 13 15:57:00.857074 systemd[1]: Successfully loaded SELinux policy in 63.100ms.
Feb 13 15:57:00.857105 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 15.026ms.
Feb 13 15:57:00.860992 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Feb 13 15:57:00.861053 systemd[1]: Detected virtualization kvm.
Feb 13 15:57:00.861077 systemd[1]: Detected architecture x86-64.
Feb 13 15:57:00.861098 systemd[1]: Detected first boot.
Feb 13 15:57:00.861138 systemd[1]: Hostname set to .
Feb 13 15:57:00.866407 systemd[1]: Initializing machine ID from VM UUID.
Feb 13 15:57:00.866442 zram_generator::config[1030]: No configuration found.
Feb 13 15:57:00.866465 systemd[1]: Populated /etc with preset unit settings.
Feb 13 15:57:00.866486 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 13 15:57:00.866506 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Feb 13 15:57:00.866528 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 13 15:57:00.866549 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Feb 13 15:57:00.866579 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Feb 13 15:57:00.866603 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Feb 13 15:57:00.866622 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Feb 13 15:57:00.866641 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Feb 13 15:57:00.866660 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Feb 13 15:57:00.866680 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Feb 13 15:57:00.866716 systemd[1]: Created slice user.slice - User and Session Slice.
Feb 13 15:57:00.866734 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 15:57:00.866753 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 15:57:00.866772 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Feb 13 15:57:00.866795 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Feb 13 15:57:00.866815 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Feb 13 15:57:00.866835 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 15:57:00.866854 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Feb 13 15:57:00.866873 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 15:57:00.866893 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Feb 13 15:57:00.866912 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Feb 13 15:57:00.866935 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Feb 13 15:57:00.866955 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Feb 13 15:57:00.866974 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 15:57:00.867001 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 15:57:00.867021 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 15:57:00.867042 systemd[1]: Reached target swap.target - Swaps.
Feb 13 15:57:00.867062 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Feb 13 15:57:00.867080 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Feb 13 15:57:00.867102 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 15:57:00.867151 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 15:57:00.867171 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 15:57:00.867190 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Feb 13 15:57:00.867209 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Feb 13 15:57:00.867230 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Feb 13 15:57:00.867249 systemd[1]: Mounting media.mount - External Media Directory...
Feb 13 15:57:00.867269 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:00.867291 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Feb 13 15:57:00.867317 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Feb 13 15:57:00.867337 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Feb 13 15:57:00.867357 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 13 15:57:00.867389 systemd[1]: Reached target machines.target - Containers.
Feb 13 15:57:00.867408 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Feb 13 15:57:00.867427 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 15:57:00.867447 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 15:57:00.867465 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Feb 13 15:57:00.867490 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 15:57:00.867509 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 15:57:00.867533 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 15:57:00.867552 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Feb 13 15:57:00.867569 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 15:57:00.867595 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Feb 13 15:57:00.867613 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 13 15:57:00.867631 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Feb 13 15:57:00.867652 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Feb 13 15:57:00.867670 systemd[1]: Stopped systemd-fsck-usr.service.
Feb 13 15:57:00.867689 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 15:57:00.867706 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 15:57:00.867724 kernel: loop: module loaded
Feb 13 15:57:00.867744 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Feb 13 15:57:00.867761 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Feb 13 15:57:00.867780 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 15:57:00.867799 systemd[1]: verity-setup.service: Deactivated successfully.
Feb 13 15:57:00.867816 systemd[1]: Stopped verity-setup.service.
Feb 13 15:57:00.867839 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:00.867856 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Feb 13 15:57:00.867874 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Feb 13 15:57:00.867890 systemd[1]: Mounted media.mount - External Media Directory.
Feb 13 15:57:00.867909 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Feb 13 15:57:00.867932 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Feb 13 15:57:00.867950 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Feb 13 15:57:00.867969 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 15:57:00.867988 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 13 15:57:00.868006 kernel: ACPI: bus type drm_connector registered
Feb 13 15:57:00.868025 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Feb 13 15:57:00.868042 kernel: fuse: init (API version 7.39)
Feb 13 15:57:00.868066 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 15:57:00.868084 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 15:57:00.868102 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 15:57:00.870260 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 15:57:00.870311 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 15:57:00.870331 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 15:57:00.870364 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 13 15:57:00.870383 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Feb 13 15:57:00.870402 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 15:57:00.870420 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 15:57:00.870437 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 15:57:00.870458 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Feb 13 15:57:00.870477 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Feb 13 15:57:00.870496 systemd[1]: Reached target network-pre.target - Preparation for Network.
Feb 13 15:57:00.870515 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Feb 13 15:57:00.870606 systemd-journald[1103]: Collecting audit messages is disabled.
Feb 13 15:57:00.870647 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Feb 13 15:57:00.870668 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Feb 13 15:57:00.870687 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 15:57:00.870724 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Feb 13 15:57:00.870747 systemd-journald[1103]: Journal started
Feb 13 15:57:00.870797 systemd-journald[1103]: Runtime Journal (/run/log/journal/d5bab27dbead42659f3f58e5321fce5e) is 4.9M, max 39.3M, 34.4M free.
Feb 13 15:57:00.279870 systemd[1]: Queued start job for default target multi-user.target.
Feb 13 15:57:00.311459 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Feb 13 15:57:00.322606 systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 13 15:57:00.880770 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Feb 13 15:57:00.896142 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Feb 13 15:57:00.896293 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:57:00.908979 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Feb 13 15:57:00.914156 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 15:57:00.925644 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Feb 13 15:57:00.925815 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 15:57:00.945160 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 15:57:00.954171 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Feb 13 15:57:00.965309 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 15:57:00.963666 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Feb 13 15:57:00.964774 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Feb 13 15:57:00.968778 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Feb 13 15:57:00.972002 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Feb 13 15:57:01.062451 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Feb 13 15:57:01.074609 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Feb 13 15:57:01.088886 kernel: loop0: detected capacity change from 0 to 138184
Feb 13 15:57:01.079368 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 15:57:01.081014 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Feb 13 15:57:01.087405 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Feb 13 15:57:01.103749 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Feb 13 15:57:01.118813 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Feb 13 15:57:01.197277 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Feb 13 15:57:01.211297 systemd-journald[1103]: Time spent on flushing to /var/log/journal/d5bab27dbead42659f3f58e5321fce5e is 66.950ms for 977 entries.
Feb 13 15:57:01.211297 systemd-journald[1103]: System Journal (/var/log/journal/d5bab27dbead42659f3f58e5321fce5e) is 8.0M, max 195.6M, 187.6M free.
Feb 13 15:57:01.316587 systemd-journald[1103]: Received client request to flush runtime journal.
Feb 13 15:57:01.316684 kernel: loop1: detected capacity change from 0 to 8
Feb 13 15:57:01.230714 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 15:57:01.262646 udevadm[1159]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Feb 13 15:57:01.332384 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Feb 13 15:57:01.341792 kernel: loop2: detected capacity change from 0 to 141000
Feb 13 15:57:01.349112 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Feb 13 15:57:01.357467 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Feb 13 15:57:01.412199 kernel: loop3: detected capacity change from 0 to 210664
Feb 13 15:57:01.426579 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Feb 13 15:57:01.439559 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 15:57:01.511193 kernel: loop4: detected capacity change from 0 to 138184
Feb 13 15:57:01.564143 kernel: loop5: detected capacity change from 0 to 8
Feb 13 15:57:01.564304 kernel: loop6: detected capacity change from 0 to 141000
Feb 13 15:57:01.589171 systemd-tmpfiles[1172]: ACLs are not supported, ignoring.
Feb 13 15:57:01.589202 systemd-tmpfiles[1172]: ACLs are not supported, ignoring.
Feb 13 15:57:01.611419 kernel: loop7: detected capacity change from 0 to 210664
Feb 13 15:57:01.638071 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 15:57:01.654239 (sd-merge)[1175]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-digitalocean'.
Feb 13 15:57:01.655280 (sd-merge)[1175]: Merged extensions into '/usr'.
Feb 13 15:57:01.680369 systemd[1]: Reloading requested from client PID 1132 ('systemd-sysext') (unit systemd-sysext.service)...
Feb 13 15:57:01.680398 systemd[1]: Reloading...
Feb 13 15:57:01.943164 zram_generator::config[1205]: No configuration found.
Feb 13 15:57:02.225743 ldconfig[1128]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Feb 13 15:57:02.279337 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 15:57:02.357357 systemd[1]: Reloading finished in 676 ms.
Feb 13 15:57:02.402741 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Feb 13 15:57:02.409993 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Feb 13 15:57:02.424678 systemd[1]: Starting ensure-sysext.service...
Feb 13 15:57:02.437532 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 15:57:02.456519 systemd[1]: Reloading requested from client PID 1245 ('systemctl') (unit ensure-sysext.service)...
Feb 13 15:57:02.456558 systemd[1]: Reloading...
Feb 13 15:57:02.519465 systemd-tmpfiles[1246]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Feb 13 15:57:02.519840 systemd-tmpfiles[1246]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Feb 13 15:57:02.526346 systemd-tmpfiles[1246]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Feb 13 15:57:02.526933 systemd-tmpfiles[1246]: ACLs are not supported, ignoring.
Feb 13 15:57:02.527007 systemd-tmpfiles[1246]: ACLs are not supported, ignoring.
Feb 13 15:57:02.538969 systemd-tmpfiles[1246]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 15:57:02.538990 systemd-tmpfiles[1246]: Skipping /boot
Feb 13 15:57:02.608755 systemd-tmpfiles[1246]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 15:57:02.608781 systemd-tmpfiles[1246]: Skipping /boot
Feb 13 15:57:02.707181 zram_generator::config[1279]: No configuration found.
Feb 13 15:57:02.939147 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 15:57:03.029704 systemd[1]: Reloading finished in 572 ms.
Feb 13 15:57:03.057751 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Feb 13 15:57:03.070202 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 15:57:03.086877 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Feb 13 15:57:03.097592 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Feb 13 15:57:03.108477 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Feb 13 15:57:03.117393 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 15:57:03.127671 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 15:57:03.141454 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Feb 13 15:57:03.153968 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.154374 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 15:57:03.173812 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 15:57:03.181287 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 15:57:03.191015 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 15:57:03.193809 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:57:03.203970 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.224746 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Feb 13 15:57:03.226394 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 15:57:03.226888 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 15:57:03.233936 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.234411 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 15:57:03.240669 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 15:57:03.241737 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:57:03.241977 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.243234 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Feb 13 15:57:03.250772 systemd-udevd[1328]: Using default interface naming scheme 'v255'.
Feb 13 15:57:03.254092 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.254554 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 15:57:03.262716 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 15:57:03.263736 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:57:03.264033 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.267556 systemd[1]: Finished ensure-sysext.service.
Feb 13 15:57:03.285440 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Feb 13 15:57:03.311589 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Feb 13 15:57:03.314207 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Feb 13 15:57:03.327524 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Feb 13 15:57:03.329343 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Feb 13 15:57:03.329894 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 15:57:03.332429 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 15:57:03.334663 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 15:57:03.353571 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 15:57:03.405679 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 15:57:03.406227 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 15:57:03.451517 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 15:57:03.452861 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 15:57:03.458089 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 15:57:03.460670 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 15:57:03.462359 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 15:57:03.476696 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 15:57:03.493761 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Feb 13 15:57:03.497488 augenrules[1377]: No rules
Feb 13 15:57:03.499738 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 15:57:03.500574 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Feb 13 15:57:03.551452 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Feb 13 15:57:03.583487 systemd[1]: Mounting media-configdrive.mount - /media/configdrive...
Feb 13 15:57:03.584539 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.584747 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 15:57:03.595645 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 15:57:03.610482 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 15:57:03.619517 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 15:57:03.622604 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:57:03.622853 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Feb 13 15:57:03.622922 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:57:03.664187 kernel: ISO 9660 Extensions: RRIP_1991A
Feb 13 15:57:03.670320 systemd[1]: Mounted media-configdrive.mount - /media/configdrive.
Feb 13 15:57:03.680598 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 15:57:03.681390 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 15:57:03.722795 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 15:57:03.723184 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 15:57:03.725644 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 15:57:03.738672 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 15:57:03.739089 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 15:57:03.741890 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 15:57:03.768480 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Feb 13 15:57:03.809309 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (1357)
Feb 13 15:57:03.827392 systemd-resolved[1326]: Positive Trust Anchors:
Feb 13 15:57:03.827415 systemd-resolved[1326]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 15:57:03.827475 systemd-resolved[1326]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 15:57:03.837441 systemd-resolved[1326]: Using system hostname 'ci-4186.1.1-1-7a196a8365'.
Feb 13 15:57:03.846337 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 15:57:03.847511 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 15:57:03.899519 systemd-networkd[1360]: lo: Link UP
Feb 13 15:57:03.899534 systemd-networkd[1360]: lo: Gained carrier
Feb 13 15:57:03.905479 systemd-networkd[1360]: Enumeration completed
Feb 13 15:57:03.905679 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 15:57:03.906576 systemd[1]: Reached target network.target - Network.
Feb 13 15:57:03.907916 systemd-networkd[1360]: eth0: Configuring with /run/systemd/network/10-82:72:b9:42:f6:7a.network.
Feb 13 15:57:03.909400 systemd-networkd[1360]: eth1: Configuring with /run/systemd/network/10-c6:f0:ab:71:93:ce.network.
Feb 13 15:57:03.914431 systemd-networkd[1360]: eth0: Link UP
Feb 13 15:57:03.914448 systemd-networkd[1360]: eth0: Gained carrier
Feb 13 15:57:03.919430 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Feb 13 15:57:03.920670 systemd-networkd[1360]: eth1: Link UP
Feb 13 15:57:03.920683 systemd-networkd[1360]: eth1: Gained carrier
Feb 13 15:57:03.953340 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Feb 13 15:57:03.954592 systemd[1]: Reached target time-set.target - System Time Set.
Feb 13 15:57:03.968999 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Feb 13 15:57:03.982690 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Feb 13 15:57:04.000754 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 13 15:57:04.021661 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Feb 13 15:57:04.027154 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Feb 13 15:57:04.035152 kernel: ACPI: button: Power Button [PWRF]
Feb 13 15:57:04.053550 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Feb 13 15:57:04.130155 kernel: mousedev: PS/2 mouse device common for all mice
Feb 13 15:57:04.178872 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:57:04.220156 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 13 15:57:04.223189 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 13 15:57:04.236779 kernel: Console: switching to colour dummy device 80x25
Feb 13 15:57:04.236938 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 13 15:57:04.236964 kernel: [drm] features: -context_init
Feb 13 15:57:04.256164 kernel: [drm] number of scanouts: 1
Feb 13 15:57:04.256319 kernel: [drm] number of cap sets: 0
Feb 13 15:57:04.271899 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 15:57:04.272713 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:57:04.289231 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Feb 13 15:57:04.289383 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 13 15:57:04.297678 kernel: Console: switching to colour frame buffer device 128x48
Feb 13 15:57:04.299161 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 13 15:57:04.303549 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:57:04.354490 systemd-timesyncd[1346]: Contacted time server 198.46.254.130:123 (0.flatcar.pool.ntp.org).
Feb 13 15:57:04.355611 systemd-timesyncd[1346]: Initial clock synchronization to Thu 2025-02-13 15:57:04.690861 UTC.
Feb 13 15:57:04.378554 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 15:57:04.379822 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:57:04.397757 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:57:04.506925 kernel: EDAC MC: Ver: 3.0.0
Feb 13 15:57:04.543431 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Feb 13 15:57:04.559426 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Feb 13 15:57:04.593154 lvm[1433]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 15:57:04.594942 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:57:04.653575 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Feb 13 15:57:04.657694 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 15:57:04.658386 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 15:57:04.659026 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Feb 13 15:57:04.659795 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Feb 13 15:57:04.668190 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Feb 13 15:57:04.668914 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Feb 13 15:57:04.669096 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Feb 13 15:57:04.670324 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Feb 13 15:57:04.670379 systemd[1]: Reached target paths.target - Path Units.
Feb 13 15:57:04.670496 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 15:57:04.674830 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Feb 13 15:57:04.679869 systemd[1]: Starting docker.socket - Docker Socket for the API...
Feb 13 15:57:04.701060 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Feb 13 15:57:04.727062 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Feb 13 15:57:04.747229 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Feb 13 15:57:04.757089 lvm[1439]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 15:57:04.757842 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 15:57:04.758598 systemd[1]: Reached target basic.target - Basic System.
Feb 13 15:57:04.761823 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Feb 13 15:57:04.762069 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Feb 13 15:57:04.774563 systemd[1]: Starting containerd.service - containerd container runtime...
Feb 13 15:57:04.790564 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Feb 13 15:57:04.824649 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Feb 13 15:57:04.842358 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Feb 13 15:57:04.876540 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Feb 13 15:57:04.886325 jq[1445]: false
Feb 13 15:57:04.886382 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Feb 13 15:57:04.906379 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Feb 13 15:57:04.914413 coreos-metadata[1441]: Feb 13 15:57:04.913 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 15:57:04.921060 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Feb 13 15:57:04.941281 coreos-metadata[1441]: Feb 13 15:57:04.938 INFO Fetch successful
Feb 13 15:57:04.939080 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Feb 13 15:57:04.972502 systemd[1]: Starting systemd-logind.service - User Login Management...
Feb 13 15:57:04.975610 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Feb 13 15:57:04.977769 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Feb 13 15:57:04.982561 systemd[1]: Starting update-engine.service - Update Engine...
Feb 13 15:57:04.991398 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Feb 13 15:57:04.995203 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Feb 13 15:57:05.003668 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Feb 13 15:57:05.005949 dbus-daemon[1442]: [system] SELinux support is enabled
Feb 13 15:57:05.003953 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Feb 13 15:57:05.009000 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Feb 13 15:57:05.022048 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Feb 13 15:57:05.022976 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Feb 13 15:57:05.059675 extend-filesystems[1446]: Found loop4
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found loop5
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found loop6
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found loop7
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda1
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda2
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda3
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found usr
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda4
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda6
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda7
Feb 13 15:57:05.069887 extend-filesystems[1446]: Found vda9
Feb 13 15:57:05.069887 extend-filesystems[1446]: Checking size of /dev/vda9
Feb 13 15:57:05.126857 jq[1456]: true
Feb 13 15:57:05.075923 systemd[1]: motdgen.service: Deactivated successfully.
Feb 13 15:57:05.076290 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Feb 13 15:57:05.109653 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Feb 13 15:57:05.109783 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Feb 13 15:57:05.121524 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Feb 13 15:57:05.121693 systemd[1]: user-configdrive.service - Load cloud-config from /media/configdrive was skipped because of an unmet condition check (ConditionKernelCommandLine=!flatcar.oem.id=digitalocean).
Feb 13 15:57:05.121732 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Feb 13 15:57:05.154273 update_engine[1454]: I20250213 15:57:05.153378 1454 main.cc:92] Flatcar Update Engine starting
Feb 13 15:57:05.162056 update_engine[1454]: I20250213 15:57:05.161785 1454 update_check_scheduler.cc:74] Next update check in 7m4s
Feb 13 15:57:05.181773 systemd[1]: Started update-engine.service - Update Engine.
Feb 13 15:57:05.181922 (ntainerd)[1468]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Feb 13 15:57:05.190525 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Feb 13 15:57:05.196783 extend-filesystems[1446]: Resized partition /dev/vda9
Feb 13 15:57:05.205213 extend-filesystems[1482]: resize2fs 1.47.1 (20-May-2024)
Feb 13 15:57:05.224241 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 15121403 blocks
Feb 13 15:57:05.237291 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Feb 13 15:57:05.237645 jq[1475]: true
Feb 13 15:57:05.267740 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Feb 13 15:57:05.345846 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (1368)
Feb 13 15:57:05.573804 kernel: EXT4-fs (vda9): resized filesystem to 15121403
Feb 13 15:57:05.649539 systemd-networkd[1360]: eth0: Gained IPv6LL
Feb 13 15:57:05.665472 extend-filesystems[1482]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Feb 13 15:57:05.665472 extend-filesystems[1482]: old_desc_blocks = 1, new_desc_blocks = 8
Feb 13 15:57:05.665472 extend-filesystems[1482]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long.
Feb 13 15:57:05.719860 extend-filesystems[1446]: Resized filesystem in /dev/vda9
Feb 13 15:57:05.719860 extend-filesystems[1446]: Found vdb
Feb 13 15:57:05.752033 bash[1504]: Updated "/home/core/.ssh/authorized_keys"
Feb 13 15:57:05.674303 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Feb 13 15:57:05.690925 systemd[1]: Reached target network-online.target - Network is Online.
Feb 13 15:57:05.708405 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:05.726389 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Feb 13 15:57:05.761114 systemd[1]: extend-filesystems.service: Deactivated successfully.
Feb 13 15:57:05.763307 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Feb 13 15:57:05.767503 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Feb 13 15:57:05.785531 systemd-logind[1453]: New seat seat0.
Feb 13 15:57:05.792808 systemd-logind[1453]: Watching system buttons on /dev/input/event1 (Power Button)
Feb 13 15:57:05.798703 systemd[1]: Starting sshkeys.service...
Feb 13 15:57:05.811267 systemd-logind[1453]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Feb 13 15:57:05.818674 systemd[1]: Started systemd-logind.service - User Login Management.
Feb 13 15:57:05.832426 systemd-networkd[1360]: eth1: Gained IPv6LL
Feb 13 15:57:05.945473 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Feb 13 15:57:05.994786 locksmithd[1481]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Feb 13 15:57:05.995915 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Feb 13 15:57:06.005416 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Feb 13 15:57:06.154351 coreos-metadata[1521]: Feb 13 15:57:06.154 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 15:57:06.173182 coreos-metadata[1521]: Feb 13 15:57:06.172 INFO Fetch successful
Feb 13 15:57:06.192094 unknown[1521]: wrote ssh authorized keys file for user: core
Feb 13 15:57:06.243140 update-ssh-keys[1527]: Updated "/home/core/.ssh/authorized_keys"
Feb 13 15:57:06.246339 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Feb 13 15:57:06.251733 systemd[1]: Finished sshkeys.service.
Feb 13 15:57:06.331212 containerd[1468]: time="2025-02-13T15:57:06.329063902Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Feb 13 15:57:06.414708 sshd_keygen[1472]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Feb 13 15:57:06.438084 containerd[1468]: time="2025-02-13T15:57:06.436743730Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447335347Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447410167Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447442913Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447730434Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447756140Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447847667Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.447866830Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.448403934Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.448433705Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.448456477Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:57:06.450282 containerd[1468]: time="2025-02-13T15:57:06.448473670Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.451049 containerd[1468]: time="2025-02-13T15:57:06.448663866Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.451049 containerd[1468]: time="2025-02-13T15:57:06.449058314Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:57:06.451049 containerd[1468]: time="2025-02-13T15:57:06.450288829Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:57:06.451049 containerd[1468]: time="2025-02-13T15:57:06.450328502Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Feb 13 15:57:06.451049 containerd[1468]: time="2025-02-13T15:57:06.450619970Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Feb 13 15:57:06.451049 containerd[1468]: time="2025-02-13T15:57:06.450703326Z" level=info msg="metadata content store policy set" policy=shared
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.476229121Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.476361924Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.476389419Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.476420304Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.476444952Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.476894770Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477259811Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477454832Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477480455Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477513271Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477536316Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477560426Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477584254Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.484802 containerd[1468]: time="2025-02-13T15:57:06.477606260Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477627769Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477649112Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477669922Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477688270Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477714085Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477751873Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477786576Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477806908Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477819943Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477833664Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477847904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477861726Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477875485Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.485473 containerd[1468]: time="2025-02-13T15:57:06.477892258Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.477904848Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.477918164Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.477946164Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.477963118Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.477989906Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478010335Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478021981Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478075485Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478097539Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478110204Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478123460Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478134931Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478148330Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Feb 13 15:57:06.486058 containerd[1468]: time="2025-02-13T15:57:06.478190832Z" level=info msg="NRI interface is disabled by configuration."
Feb 13 15:57:06.486611 containerd[1468]: time="2025-02-13T15:57:06.478202258Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.481305094Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.481449446Z" level=info msg="Connect containerd service" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.481577305Z" level=info msg="using legacy CRI server" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.481593362Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.481827587Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.483718717Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.484546498Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.484703932Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.484809725Z" level=info msg="Start subscribing containerd event" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.484929246Z" level=info msg="Start recovering state" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.485061900Z" level=info msg="Start event monitor" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.485151808Z" level=info msg="Start snapshots syncer" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.485174101Z" level=info msg="Start cni network conf syncer for default" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.485186581Z" level=info msg="Start streaming server" Feb 13 15:57:06.486654 containerd[1468]: time="2025-02-13T15:57:06.485333951Z" level=info msg="containerd successfully booted in 0.164309s" Feb 13 15:57:06.487388 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 15:57:06.540184 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Feb 13 15:57:06.556289 systemd[1]: Starting issuegen.service - Generate /run/issue... Feb 13 15:57:06.602648 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 15:57:06.603049 systemd[1]: Finished issuegen.service - Generate /run/issue. Feb 13 15:57:06.621405 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Feb 13 15:57:06.672731 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Feb 13 15:57:06.708268 systemd[1]: Started getty@tty1.service - Getty on tty1. Feb 13 15:57:06.719987 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Feb 13 15:57:06.721092 systemd[1]: Reached target getty.target - Login Prompts. Feb 13 15:57:07.906776 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:57:07.912607 systemd[1]: Reached target multi-user.target - Multi-User System. 
Feb 13 15:57:07.917746 systemd[1]: Startup finished in 1.643s (kernel) + 6.559s (initrd) + 8.616s (userspace) = 16.819s. Feb 13 15:57:07.923874 (kubelet)[1557]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 15:57:07.957551 agetty[1551]: failed to open credentials directory Feb 13 15:57:07.970069 agetty[1550]: failed to open credentials directory Feb 13 15:57:08.739165 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 15:57:08.765486 systemd[1]: Started sshd@0-143.198.68.221:22-139.178.89.65:40372.service - OpenSSH per-connection server daemon (139.178.89.65:40372). Feb 13 15:57:09.058948 sshd[1568]: Accepted publickey for core from 139.178.89.65 port 40372 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:09.068263 sshd-session[1568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:09.102903 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 15:57:09.119099 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Feb 13 15:57:09.133288 systemd-logind[1453]: New session 1 of user core. Feb 13 15:57:09.209311 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 15:57:09.238537 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 15:57:09.262835 (systemd)[1572]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 15:57:09.655056 systemd[1572]: Queued start job for default target default.target. Feb 13 15:57:09.669223 systemd[1572]: Created slice app.slice - User Application Slice. Feb 13 15:57:09.669307 systemd[1572]: Reached target paths.target - Paths. Feb 13 15:57:09.669334 systemd[1572]: Reached target timers.target - Timers. Feb 13 15:57:09.685400 systemd[1572]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Feb 13 15:57:09.688363 kubelet[1557]: E0213 15:57:09.688300 1557 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 15:57:09.693927 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 15:57:09.694312 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 15:57:09.699872 systemd[1]: kubelet.service: Consumed 1.713s CPU time. Feb 13 15:57:09.729000 systemd[1572]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 15:57:09.730985 systemd[1572]: Reached target sockets.target - Sockets. Feb 13 15:57:09.731025 systemd[1572]: Reached target basic.target - Basic System. Feb 13 15:57:09.731172 systemd[1572]: Reached target default.target - Main User Target. Feb 13 15:57:09.731228 systemd[1572]: Startup finished in 447ms. Feb 13 15:57:09.731930 systemd[1]: Started user@500.service - User Manager for UID 500. Feb 13 15:57:09.757521 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 15:57:09.851809 systemd[1]: Started sshd@1-143.198.68.221:22-139.178.89.65:40380.service - OpenSSH per-connection server daemon (139.178.89.65:40380). Feb 13 15:57:09.960886 sshd[1585]: Accepted publickey for core from 139.178.89.65 port 40380 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:09.967149 sshd-session[1585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:09.982426 systemd-logind[1453]: New session 2 of user core. Feb 13 15:57:09.995043 systemd[1]: Started session-2.scope - Session 2 of User core. 
Feb 13 15:57:10.098207 sshd[1587]: Connection closed by 139.178.89.65 port 40380 Feb 13 15:57:10.099221 sshd-session[1585]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:10.109859 systemd[1]: sshd@1-143.198.68.221:22-139.178.89.65:40380.service: Deactivated successfully. Feb 13 15:57:10.115933 systemd[1]: session-2.scope: Deactivated successfully. Feb 13 15:57:10.117947 systemd-logind[1453]: Session 2 logged out. Waiting for processes to exit. Feb 13 15:57:10.134975 systemd[1]: Started sshd@2-143.198.68.221:22-139.178.89.65:40390.service - OpenSSH per-connection server daemon (139.178.89.65:40390). Feb 13 15:57:10.137835 systemd-logind[1453]: Removed session 2. Feb 13 15:57:10.238860 sshd[1592]: Accepted publickey for core from 139.178.89.65 port 40390 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:10.241807 sshd-session[1592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:10.250592 systemd-logind[1453]: New session 3 of user core. Feb 13 15:57:10.273251 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 15:57:10.350202 sshd[1594]: Connection closed by 139.178.89.65 port 40390 Feb 13 15:57:10.348192 sshd-session[1592]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:10.366529 systemd[1]: sshd@2-143.198.68.221:22-139.178.89.65:40390.service: Deactivated successfully. Feb 13 15:57:10.371419 systemd[1]: session-3.scope: Deactivated successfully. Feb 13 15:57:10.378124 systemd-logind[1453]: Session 3 logged out. Waiting for processes to exit. Feb 13 15:57:10.391775 systemd[1]: Started sshd@3-143.198.68.221:22-139.178.89.65:40394.service - OpenSSH per-connection server daemon (139.178.89.65:40394). Feb 13 15:57:10.394608 systemd-logind[1453]: Removed session 3. 
Feb 13 15:57:10.467409 sshd[1599]: Accepted publickey for core from 139.178.89.65 port 40394 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:10.470957 sshd-session[1599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:10.487385 systemd-logind[1453]: New session 4 of user core. Feb 13 15:57:10.494662 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 15:57:10.584815 sshd[1601]: Connection closed by 139.178.89.65 port 40394 Feb 13 15:57:10.588879 sshd-session[1599]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:10.600979 systemd[1]: sshd@3-143.198.68.221:22-139.178.89.65:40394.service: Deactivated successfully. Feb 13 15:57:10.606016 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 15:57:10.608084 systemd-logind[1453]: Session 4 logged out. Waiting for processes to exit. Feb 13 15:57:10.631891 systemd[1]: Started sshd@4-143.198.68.221:22-139.178.89.65:40410.service - OpenSSH per-connection server daemon (139.178.89.65:40410). Feb 13 15:57:10.634191 systemd-logind[1453]: Removed session 4. Feb 13 15:57:10.700712 sshd[1606]: Accepted publickey for core from 139.178.89.65 port 40410 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:10.703627 sshd-session[1606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:10.717645 systemd-logind[1453]: New session 5 of user core. Feb 13 15:57:10.727709 systemd[1]: Started session-5.scope - Session 5 of User core. 
Feb 13 15:57:10.821371 sudo[1609]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 15:57:10.821895 sudo[1609]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:10.844594 sudo[1609]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:10.850016 sshd[1608]: Connection closed by 139.178.89.65 port 40410 Feb 13 15:57:10.851795 sshd-session[1606]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:10.869638 systemd[1]: sshd@4-143.198.68.221:22-139.178.89.65:40410.service: Deactivated successfully. Feb 13 15:57:10.874736 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 15:57:10.879820 systemd-logind[1453]: Session 5 logged out. Waiting for processes to exit. Feb 13 15:57:10.897636 systemd[1]: Started sshd@5-143.198.68.221:22-139.178.89.65:40426.service - OpenSSH per-connection server daemon (139.178.89.65:40426). Feb 13 15:57:10.899641 systemd-logind[1453]: Removed session 5. Feb 13 15:57:10.980839 sshd[1614]: Accepted publickey for core from 139.178.89.65 port 40426 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:10.983801 sshd-session[1614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:11.001893 systemd-logind[1453]: New session 6 of user core. Feb 13 15:57:11.011524 systemd[1]: Started session-6.scope - Session 6 of User core. 
Feb 13 15:57:11.110635 sudo[1618]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 15:57:11.111073 sudo[1618]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:11.126049 sudo[1618]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:11.137597 sudo[1617]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 15:57:11.138225 sudo[1617]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:11.276235 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 15:57:11.344422 augenrules[1640]: No rules Feb 13 15:57:11.347933 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 15:57:11.348294 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 15:57:11.351395 sudo[1617]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:11.358809 sshd[1616]: Connection closed by 139.178.89.65 port 40426 Feb 13 15:57:11.361941 sshd-session[1614]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:11.383543 systemd[1]: sshd@5-143.198.68.221:22-139.178.89.65:40426.service: Deactivated successfully. Feb 13 15:57:11.387633 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 15:57:11.392698 systemd-logind[1453]: Session 6 logged out. Waiting for processes to exit. Feb 13 15:57:11.399853 systemd[1]: Started sshd@6-143.198.68.221:22-139.178.89.65:40428.service - OpenSSH per-connection server daemon (139.178.89.65:40428). Feb 13 15:57:11.403376 systemd-logind[1453]: Removed session 6. 
Feb 13 15:57:11.494182 sshd[1648]: Accepted publickey for core from 139.178.89.65 port 40428 ssh2: RSA SHA256:xbQMFxKGhsFroWszVX4n07fPkTy8VMnJgGT8GFjL/e4 Feb 13 15:57:11.498006 sshd-session[1648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:11.508174 systemd-logind[1453]: New session 7 of user core. Feb 13 15:57:11.516755 systemd[1]: Started session-7.scope - Session 7 of User core. Feb 13 15:57:11.604636 sudo[1651]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 15:57:11.605254 sudo[1651]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:13.071638 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:57:13.071888 systemd[1]: kubelet.service: Consumed 1.713s CPU time. Feb 13 15:57:13.084550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:57:13.136350 systemd[1]: Reloading requested from client PID 1689 ('systemctl') (unit session-7.scope)... Feb 13 15:57:13.136783 systemd[1]: Reloading... Feb 13 15:57:13.454190 zram_generator::config[1729]: No configuration found. Feb 13 15:57:13.675395 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 15:57:13.795215 systemd[1]: Reloading finished in 657 ms. Feb 13 15:57:13.949930 (kubelet)[1772]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 15:57:13.953126 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:57:13.954071 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 15:57:13.954522 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:57:13.967848 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Feb 13 15:57:14.253230 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:57:14.253407 (kubelet)[1782]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 15:57:14.383212 kubelet[1782]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 15:57:14.383212 kubelet[1782]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 13 15:57:14.383212 kubelet[1782]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 15:57:14.383212 kubelet[1782]: I0213 15:57:14.382172 1782 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 15:57:14.823109 kubelet[1782]: I0213 15:57:14.822989 1782 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Feb 13 15:57:14.823109 kubelet[1782]: I0213 15:57:14.823047 1782 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 15:57:14.823464 kubelet[1782]: I0213 15:57:14.823405 1782 server.go:927] "Client rotation is on, will bootstrap in background" Feb 13 15:57:14.859896 kubelet[1782]: I0213 15:57:14.852941 1782 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 15:57:14.883759 kubelet[1782]: I0213 15:57:14.882861 1782 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 15:57:14.886321 kubelet[1782]: I0213 15:57:14.885603 1782 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 15:57:14.886321 kubelet[1782]: I0213 15:57:14.885718 1782 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"143.198.68.221","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Feb 13 15:57:14.887268 kubelet[1782]: I0213 15:57:14.887221 1782 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 
15:57:14.887439 kubelet[1782]: I0213 15:57:14.887424 1782 container_manager_linux.go:301] "Creating device plugin manager" Feb 13 15:57:14.887765 kubelet[1782]: I0213 15:57:14.887744 1782 state_mem.go:36] "Initialized new in-memory state store" Feb 13 15:57:14.889009 kubelet[1782]: I0213 15:57:14.888973 1782 kubelet.go:400] "Attempting to sync node with API server" Feb 13 15:57:14.889810 kubelet[1782]: I0213 15:57:14.889179 1782 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 15:57:14.889810 kubelet[1782]: I0213 15:57:14.889228 1782 kubelet.go:312] "Adding apiserver pod source" Feb 13 15:57:14.889810 kubelet[1782]: I0213 15:57:14.889257 1782 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 15:57:14.894435 kubelet[1782]: E0213 15:57:14.894272 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:14.894435 kubelet[1782]: E0213 15:57:14.894349 1782 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:14.896574 kubelet[1782]: I0213 15:57:14.896529 1782 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 15:57:14.899174 kubelet[1782]: I0213 15:57:14.899009 1782 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 15:57:14.900223 kubelet[1782]: W0213 15:57:14.899371 1782 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Feb 13 15:57:14.900752 kubelet[1782]: I0213 15:57:14.900726 1782 server.go:1264] "Started kubelet" Feb 13 15:57:14.909828 kubelet[1782]: I0213 15:57:14.903474 1782 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 15:57:14.909828 kubelet[1782]: I0213 15:57:14.905204 1782 server.go:455] "Adding debug handlers to kubelet server" Feb 13 15:57:14.909828 kubelet[1782]: I0213 15:57:14.908980 1782 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 15:57:14.914371 kubelet[1782]: I0213 15:57:14.913779 1782 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 15:57:14.914799 kubelet[1782]: I0213 15:57:14.914684 1782 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 15:57:14.924037 kubelet[1782]: I0213 15:57:14.922531 1782 volume_manager.go:291] "Starting Kubelet Volume Manager" Feb 13 15:57:14.924037 kubelet[1782]: W0213 15:57:14.923473 1782 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 13 15:57:14.924037 kubelet[1782]: E0213 15:57:14.923522 1782 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 13 15:57:14.924037 kubelet[1782]: W0213 15:57:14.923597 1782 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "143.198.68.221" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 13 15:57:14.924037 kubelet[1782]: E0213 15:57:14.923614 1782 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes "143.198.68.221" is 
forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 13 15:57:14.924636 kubelet[1782]: I0213 15:57:14.924415 1782 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 15:57:14.924636 kubelet[1782]: I0213 15:57:14.924552 1782 reconciler.go:26] "Reconciler: start to sync state" Feb 13 15:57:14.928777 kubelet[1782]: I0213 15:57:14.928716 1782 factory.go:221] Registration of the systemd container factory successfully Feb 13 15:57:14.929423 kubelet[1782]: I0213 15:57:14.928902 1782 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 15:57:14.932198 kubelet[1782]: E0213 15:57:14.932063 1782 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 15:57:14.932364 kubelet[1782]: I0213 15:57:14.932328 1782 factory.go:221] Registration of the containerd container factory successfully Feb 13 15:57:14.953428 kubelet[1782]: E0213 15:57:14.953361 1782 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"143.198.68.221\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Feb 13 15:57:14.954036 kubelet[1782]: W0213 15:57:14.953985 1782 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 13 15:57:14.954036 kubelet[1782]: E0213 15:57:14.954025 1782 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User 
"system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 13 15:57:14.963080 kubelet[1782]: E0213 15:57:14.954779 1782 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{143.198.68.221.1823cfafc3764d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:143.198.68.221,UID:143.198.68.221,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:143.198.68.221,},FirstTimestamp:2025-02-13 15:57:14.900680088 +0000 UTC m=+0.636010892,LastTimestamp:2025-02-13 15:57:14.900680088 +0000 UTC m=+0.636010892,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:143.198.68.221,}" Feb 13 15:57:14.969215 kubelet[1782]: I0213 15:57:14.968703 1782 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 15:57:14.969215 kubelet[1782]: I0213 15:57:14.968727 1782 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 15:57:14.969215 kubelet[1782]: I0213 15:57:14.968751 1782 state_mem.go:36] "Initialized new in-memory state store" Feb 13 15:57:14.975927 kubelet[1782]: I0213 15:57:14.974855 1782 policy_none.go:49] "None policy: Start" Feb 13 15:57:14.976415 kubelet[1782]: I0213 15:57:14.976313 1782 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 13 15:57:14.976415 kubelet[1782]: I0213 15:57:14.976359 1782 state_mem.go:35] "Initializing new in-memory state store" Feb 13 15:57:15.003023 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Feb 13 15:57:15.029501 kubelet[1782]: I0213 15:57:15.026810 1782 kubelet_node_status.go:73] "Attempting to register node" node="143.198.68.221"
Feb 13 15:57:15.028816 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Feb 13 15:57:15.040083 kubelet[1782]: I0213 15:57:15.039903 1782 kubelet_node_status.go:76] "Successfully registered node" node="143.198.68.221"
Feb 13 15:57:15.053098 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Feb 13 15:57:15.059909 kubelet[1782]: I0213 15:57:15.059724 1782 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 13 15:57:15.060191 kubelet[1782]: I0213 15:57:15.060096 1782 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 13 15:57:15.060376 kubelet[1782]: I0213 15:57:15.060346 1782 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 13 15:57:15.077917 kubelet[1782]: E0213 15:57:15.075507 1782 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"143.198.68.221\" not found"
Feb 13 15:57:15.082187 kubelet[1782]: I0213 15:57:15.081979 1782 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 13 15:57:15.086221 kubelet[1782]: I0213 15:57:15.085813 1782 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 13 15:57:15.086221 kubelet[1782]: I0213 15:57:15.085877 1782 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 13 15:57:15.086221 kubelet[1782]: I0213 15:57:15.085934 1782 kubelet.go:2337] "Starting kubelet main sync loop"
Feb 13 15:57:15.086221 kubelet[1782]: E0213 15:57:15.086179 1782 kubelet.go:2361] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Feb 13 15:57:15.087044 kubelet[1782]: E0213 15:57:15.086991 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.187903 kubelet[1782]: E0213 15:57:15.187524 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.293425 kubelet[1782]: E0213 15:57:15.291181 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.408601 kubelet[1782]: E0213 15:57:15.392752 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.496810 kubelet[1782]: E0213 15:57:15.496690 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.546802 sudo[1651]: pam_unix(sudo:session): session closed for user root
Feb 13 15:57:15.561487 sshd[1650]: Connection closed by 139.178.89.65 port 40428
Feb 13 15:57:15.565575 sshd-session[1648]: pam_unix(sshd:session): session closed for user core
Feb 13 15:57:15.579103 systemd[1]: sshd@6-143.198.68.221:22-139.178.89.65:40428.service: Deactivated successfully.
Feb 13 15:57:15.586550 systemd[1]: session-7.scope: Deactivated successfully.
Feb 13 15:57:15.597975 kubelet[1782]: E0213 15:57:15.597922 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.598411 systemd-logind[1453]: Session 7 logged out. Waiting for processes to exit.
Feb 13 15:57:15.601241 systemd-logind[1453]: Removed session 7.
Feb 13 15:57:15.701651 kubelet[1782]: E0213 15:57:15.701432 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.802967 kubelet[1782]: E0213 15:57:15.802812 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:15.829656 kubelet[1782]: I0213 15:57:15.827314 1782 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Feb 13 15:57:15.829656 kubelet[1782]: W0213 15:57:15.829393 1782 reflector.go:470] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 13 15:57:15.829656 kubelet[1782]: W0213 15:57:15.829537 1782 reflector.go:470] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received
Feb 13 15:57:15.895617 kubelet[1782]: E0213 15:57:15.895424 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:15.903284 kubelet[1782]: E0213 15:57:15.903089 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:16.004371 kubelet[1782]: E0213 15:57:16.004109 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:16.105150 kubelet[1782]: E0213 15:57:16.105025 1782 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"143.198.68.221\" not found"
Feb 13 15:57:16.207234 kubelet[1782]: I0213 15:57:16.206851 1782 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24"
Feb 13 15:57:16.208997 containerd[1468]: time="2025-02-13T15:57:16.208545379Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Feb 13 15:57:16.211205 kubelet[1782]: I0213 15:57:16.209582 1782 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24"
Feb 13 15:57:16.895865 kubelet[1782]: E0213 15:57:16.895744 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:16.898472 kubelet[1782]: I0213 15:57:16.897809 1782 apiserver.go:52] "Watching apiserver"
Feb 13 15:57:16.911121 kubelet[1782]: I0213 15:57:16.910997 1782 topology_manager.go:215] "Topology Admit Handler" podUID="0fdf8ed7-cd24-492d-8ed4-2041dfabc288" podNamespace="kube-system" podName="kube-proxy-vkzzf"
Feb 13 15:57:16.911367 kubelet[1782]: I0213 15:57:16.911269 1782 topology_manager.go:215] "Topology Admit Handler" podUID="900cfedd-57bb-4899-8255-b9e43c2da5e5" podNamespace="calico-system" podName="calico-node-d69xk"
Feb 13 15:57:16.911450 kubelet[1782]: I0213 15:57:16.911437 1782 topology_manager.go:215] "Topology Admit Handler" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" podNamespace="calico-system" podName="csi-node-driver-h6pm6"
Feb 13 15:57:16.913246 kubelet[1782]: E0213 15:57:16.911962 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:16.926640 kubelet[1782]: I0213 15:57:16.926591 1782 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Feb 13 15:57:16.927625 systemd[1]: Created slice kubepods-besteffort-pod900cfedd_57bb_4899_8255_b9e43c2da5e5.slice - libcontainer container kubepods-besteffort-pod900cfedd_57bb_4899_8255_b9e43c2da5e5.slice.
Feb 13 15:57:16.959037 kubelet[1782]: I0213 15:57:16.957932 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/900cfedd-57bb-4899-8255-b9e43c2da5e5-node-certs\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959037 kubelet[1782]: I0213 15:57:16.958016 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-run-calico\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959037 kubelet[1782]: I0213 15:57:16.958050 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-bin-dir\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959037 kubelet[1782]: I0213 15:57:16.958084 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/120c5850-254f-4135-9f32-a6b8fe647e54-kubelet-dir\") pod \"csi-node-driver-h6pm6\" (UID: \"120c5850-254f-4135-9f32-a6b8fe647e54\") " pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:16.959037 kubelet[1782]: I0213 15:57:16.958113 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fdf8ed7-cd24-492d-8ed4-2041dfabc288-lib-modules\") pod \"kube-proxy-vkzzf\" (UID: \"0fdf8ed7-cd24-492d-8ed4-2041dfabc288\") " pod="kube-system/kube-proxy-vkzzf"
Feb 13 15:57:16.959512 kubelet[1782]: I0213 15:57:16.958179 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4hg\" (UniqueName: \"kubernetes.io/projected/0fdf8ed7-cd24-492d-8ed4-2041dfabc288-kube-api-access-kb4hg\") pod \"kube-proxy-vkzzf\" (UID: \"0fdf8ed7-cd24-492d-8ed4-2041dfabc288\") " pod="kube-system/kube-proxy-vkzzf"
Feb 13 15:57:16.959512 kubelet[1782]: I0213 15:57:16.958211 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-xtables-lock\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959512 kubelet[1782]: I0213 15:57:16.958237 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/120c5850-254f-4135-9f32-a6b8fe647e54-socket-dir\") pod \"csi-node-driver-h6pm6\" (UID: \"120c5850-254f-4135-9f32-a6b8fe647e54\") " pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:16.959512 kubelet[1782]: I0213 15:57:16.958270 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crqxd\" (UniqueName: \"kubernetes.io/projected/120c5850-254f-4135-9f32-a6b8fe647e54-kube-api-access-crqxd\") pod \"csi-node-driver-h6pm6\" (UID: \"120c5850-254f-4135-9f32-a6b8fe647e54\") " pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:16.959512 kubelet[1782]: I0213 15:57:16.958308 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-lib-modules\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959720 kubelet[1782]: I0213 15:57:16.958368 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/900cfedd-57bb-4899-8255-b9e43c2da5e5-tigera-ca-bundle\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959720 kubelet[1782]: I0213 15:57:16.958395 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-net-dir\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959720 kubelet[1782]: I0213 15:57:16.958422 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0fdf8ed7-cd24-492d-8ed4-2041dfabc288-xtables-lock\") pod \"kube-proxy-vkzzf\" (UID: \"0fdf8ed7-cd24-492d-8ed4-2041dfabc288\") " pod="kube-system/kube-proxy-vkzzf"
Feb 13 15:57:16.959720 kubelet[1782]: I0213 15:57:16.958447 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-lib-calico\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959720 kubelet[1782]: I0213 15:57:16.958474 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-log-dir\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959920 kubelet[1782]: I0213 15:57:16.958503 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rd9n\" (UniqueName: \"kubernetes.io/projected/900cfedd-57bb-4899-8255-b9e43c2da5e5-kube-api-access-5rd9n\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.959920 kubelet[1782]: I0213 15:57:16.958543 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/120c5850-254f-4135-9f32-a6b8fe647e54-varrun\") pod \"csi-node-driver-h6pm6\" (UID: \"120c5850-254f-4135-9f32-a6b8fe647e54\") " pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:16.959920 kubelet[1782]: I0213 15:57:16.958575 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/120c5850-254f-4135-9f32-a6b8fe647e54-registration-dir\") pod \"csi-node-driver-h6pm6\" (UID: \"120c5850-254f-4135-9f32-a6b8fe647e54\") " pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:16.959920 kubelet[1782]: I0213 15:57:16.958664 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0fdf8ed7-cd24-492d-8ed4-2041dfabc288-kube-proxy\") pod \"kube-proxy-vkzzf\" (UID: \"0fdf8ed7-cd24-492d-8ed4-2041dfabc288\") " pod="kube-system/kube-proxy-vkzzf"
Feb 13 15:57:16.959920 kubelet[1782]: I0213 15:57:16.958696 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-policysync\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.960103 kubelet[1782]: I0213 15:57:16.958749 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-flexvol-driver-host\") pod \"calico-node-d69xk\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " pod="calico-system/calico-node-d69xk"
Feb 13 15:57:16.960931 systemd[1]: Created slice kubepods-besteffort-pod0fdf8ed7_cd24_492d_8ed4_2041dfabc288.slice - libcontainer container kubepods-besteffort-pod0fdf8ed7_cd24_492d_8ed4_2041dfabc288.slice.
Feb 13 15:57:17.081414 kubelet[1782]: E0213 15:57:17.081345 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:17.081414 kubelet[1782]: W0213 15:57:17.081395 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:17.081615 kubelet[1782]: E0213 15:57:17.081445 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:17.083170 kubelet[1782]: E0213 15:57:17.082941 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:17.083170 kubelet[1782]: W0213 15:57:17.082967 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:17.083170 kubelet[1782]: E0213 15:57:17.082990 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:17.090929 kubelet[1782]: E0213 15:57:17.089202 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:17.090929 kubelet[1782]: W0213 15:57:17.090286 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:17.090929 kubelet[1782]: E0213 15:57:17.090345 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:17.119407 kubelet[1782]: E0213 15:57:17.118303 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:17.119407 kubelet[1782]: W0213 15:57:17.118340 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:17.119407 kubelet[1782]: E0213 15:57:17.118373 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:17.127234 kubelet[1782]: E0213 15:57:17.127168 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:17.127590 kubelet[1782]: W0213 15:57:17.127204 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:17.127590 kubelet[1782]: E0213 15:57:17.127472 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:17.136502 kubelet[1782]: E0213 15:57:17.135581 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:17.136502 kubelet[1782]: W0213 15:57:17.135620 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:17.136502 kubelet[1782]: E0213 15:57:17.135735 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:17.250469 kubelet[1782]: E0213 15:57:17.249456 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
Feb 13 15:57:17.250773 containerd[1468]: time="2025-02-13T15:57:17.250638662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-d69xk,Uid:900cfedd-57bb-4899-8255-b9e43c2da5e5,Namespace:calico-system,Attempt:0,}"
Feb 13 15:57:17.269725 kubelet[1782]: E0213 15:57:17.268624 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
Feb 13 15:57:17.269966 containerd[1468]: time="2025-02-13T15:57:17.269802692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vkzzf,Uid:0fdf8ed7-cd24-492d-8ed4-2041dfabc288,Namespace:kube-system,Attempt:0,}"
Feb 13 15:57:17.896737 kubelet[1782]: E0213 15:57:17.896539 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:17.994085 containerd[1468]: time="2025-02-13T15:57:17.993259999Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 15:57:17.997962 containerd[1468]: time="2025-02-13T15:57:17.997846503Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Feb 13 15:57:18.010178 containerd[1468]: time="2025-02-13T15:57:18.008572465Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 15:57:18.012688 containerd[1468]: time="2025-02-13T15:57:18.012609153Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 15:57:18.013955 containerd[1468]: time="2025-02-13T15:57:18.013861062Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056"
Feb 13 15:57:18.019164 containerd[1468]: time="2025-02-13T15:57:18.018560438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 15:57:18.020378 containerd[1468]: time="2025-02-13T15:57:18.020312472Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 769.368134ms"
Feb 13 15:57:18.022617 containerd[1468]: time="2025-02-13T15:57:18.022548115Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 752.563941ms"
Feb 13 15:57:18.089727 kubelet[1782]: E0213 15:57:18.087186 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:18.094786 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1641838507.mount: Deactivated successfully.
Feb 13 15:57:18.286345 containerd[1468]: time="2025-02-13T15:57:18.284271971Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 15:57:18.286345 containerd[1468]: time="2025-02-13T15:57:18.284361734Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 15:57:18.286345 containerd[1468]: time="2025-02-13T15:57:18.284381497Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:57:18.286345 containerd[1468]: time="2025-02-13T15:57:18.284501983Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:57:18.293758 containerd[1468]: time="2025-02-13T15:57:18.293435203Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 15:57:18.293758 containerd[1468]: time="2025-02-13T15:57:18.293523062Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 15:57:18.293758 containerd[1468]: time="2025-02-13T15:57:18.293541392Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:57:18.293758 containerd[1468]: time="2025-02-13T15:57:18.293677699Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:57:18.475569 systemd[1]: Started cri-containerd-35375a7ff3fa3f1958a14057395c6629f8b9114e1d8920bec0c2ba341979f04d.scope - libcontainer container 35375a7ff3fa3f1958a14057395c6629f8b9114e1d8920bec0c2ba341979f04d.
Feb 13 15:57:18.481081 systemd[1]: Started cri-containerd-7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078.scope - libcontainer container 7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078.
Feb 13 15:57:18.553282 containerd[1468]: time="2025-02-13T15:57:18.552973997Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vkzzf,Uid:0fdf8ed7-cd24-492d-8ed4-2041dfabc288,Namespace:kube-system,Attempt:0,} returns sandbox id \"35375a7ff3fa3f1958a14057395c6629f8b9114e1d8920bec0c2ba341979f04d\""
Feb 13 15:57:18.558507 containerd[1468]: time="2025-02-13T15:57:18.558424076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-d69xk,Uid:900cfedd-57bb-4899-8255-b9e43c2da5e5,Namespace:calico-system,Attempt:0,} returns sandbox id \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\""
Feb 13 15:57:18.559217 kubelet[1782]: E0213 15:57:18.559175 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
Feb 13 15:57:18.563587 kubelet[1782]: E0213 15:57:18.563539 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
Feb 13 15:57:18.564359 containerd[1468]: time="2025-02-13T15:57:18.564299549Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\""
Feb 13 15:57:18.897106 kubelet[1782]: E0213 15:57:18.896830 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:19.900662 kubelet[1782]: E0213 15:57:19.899954 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:20.087916 kubelet[1782]: E0213 15:57:20.087146 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:20.290891 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2418056165.mount: Deactivated successfully.
Feb 13 15:57:20.901059 kubelet[1782]: E0213 15:57:20.900874 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:21.102670 containerd[1468]: time="2025-02-13T15:57:21.102519160Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:21.103890 containerd[1468]: time="2025-02-13T15:57:21.103682320Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.10: active requests=0, bytes read=29057858"
Feb 13 15:57:21.104836 containerd[1468]: time="2025-02-13T15:57:21.104777162Z" level=info msg="ImageCreate event name:\"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:21.110573 containerd[1468]: time="2025-02-13T15:57:21.108950363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:21.110573 containerd[1468]: time="2025-02-13T15:57:21.110297574Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.10\" with image id \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\", repo tag \"registry.k8s.io/kube-proxy:v1.30.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\", size \"29056877\" in 2.545934202s"
Feb 13 15:57:21.110573 containerd[1468]: time="2025-02-13T15:57:21.110360378Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\" returns image reference \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\""
Feb 13 15:57:21.115653 containerd[1468]: time="2025-02-13T15:57:21.115602412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Feb 13 15:57:21.117094 containerd[1468]: time="2025-02-13T15:57:21.116814311Z" level=info msg="CreateContainer within sandbox \"35375a7ff3fa3f1958a14057395c6629f8b9114e1d8920bec0c2ba341979f04d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Feb 13 15:57:21.152569 containerd[1468]: time="2025-02-13T15:57:21.151985971Z" level=info msg="CreateContainer within sandbox \"35375a7ff3fa3f1958a14057395c6629f8b9114e1d8920bec0c2ba341979f04d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ce48eacd585b4537d7c3c7afbb1d90e9b2ee35ce5dec9de040c0d5d260888b40\""
Feb 13 15:57:21.154275 containerd[1468]: time="2025-02-13T15:57:21.154141625Z" level=info msg="StartContainer for \"ce48eacd585b4537d7c3c7afbb1d90e9b2ee35ce5dec9de040c0d5d260888b40\""
Feb 13 15:57:21.211008 systemd[1]: Started cri-containerd-ce48eacd585b4537d7c3c7afbb1d90e9b2ee35ce5dec9de040c0d5d260888b40.scope - libcontainer container ce48eacd585b4537d7c3c7afbb1d90e9b2ee35ce5dec9de040c0d5d260888b40.
Feb 13 15:57:21.276627 containerd[1468]: time="2025-02-13T15:57:21.276552142Z" level=info msg="StartContainer for \"ce48eacd585b4537d7c3c7afbb1d90e9b2ee35ce5dec9de040c0d5d260888b40\" returns successfully"
Feb 13 15:57:21.903262 kubelet[1782]: E0213 15:57:21.901285 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:22.087040 kubelet[1782]: E0213 15:57:22.086908 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:22.143333 kubelet[1782]: E0213 15:57:22.143295 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
Feb 13 15:57:22.174345 kubelet[1782]: E0213 15:57:22.174010 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.174345 kubelet[1782]: W0213 15:57:22.174052 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.174345 kubelet[1782]: E0213 15:57:22.174082 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:22.175031 kubelet[1782]: E0213 15:57:22.174870 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.175031 kubelet[1782]: W0213 15:57:22.174890 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.175031 kubelet[1782]: E0213 15:57:22.174912 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:22.175538 kubelet[1782]: E0213 15:57:22.175510 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.175908 kubelet[1782]: W0213 15:57:22.175708 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.175908 kubelet[1782]: E0213 15:57:22.175742 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:22.176474 kubelet[1782]: E0213 15:57:22.176432 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.177013 kubelet[1782]: W0213 15:57:22.176795 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.177013 kubelet[1782]: E0213 15:57:22.176833 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:22.177492 kubelet[1782]: E0213 15:57:22.177303 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.177492 kubelet[1782]: W0213 15:57:22.177317 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.177492 kubelet[1782]: E0213 15:57:22.177336 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:22.177715 kubelet[1782]: E0213 15:57:22.177700 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.177792 kubelet[1782]: W0213 15:57:22.177780 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.177881 kubelet[1782]: E0213 15:57:22.177867 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 15:57:22.178613 kubelet[1782]: E0213 15:57:22.178442 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 15:57:22.178613 kubelet[1782]: W0213 15:57:22.178469 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 15:57:22.178613 kubelet[1782]: E0213 15:57:22.178489 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Feb 13 15:57:22.178911 kubelet[1782]: E0213 15:57:22.178894 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.179895 kubelet[1782]: W0213 15:57:22.178986 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.179895 kubelet[1782]: E0213 15:57:22.179009 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.181772 kubelet[1782]: E0213 15:57:22.180476 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.181772 kubelet[1782]: W0213 15:57:22.180499 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.181772 kubelet[1782]: E0213 15:57:22.180516 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.182473 kubelet[1782]: E0213 15:57:22.182285 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.182473 kubelet[1782]: W0213 15:57:22.182309 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.182473 kubelet[1782]: E0213 15:57:22.182337 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.183026 kubelet[1782]: E0213 15:57:22.182917 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.183026 kubelet[1782]: W0213 15:57:22.182931 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.183026 kubelet[1782]: E0213 15:57:22.182945 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.183594 kubelet[1782]: E0213 15:57:22.183267 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.183594 kubelet[1782]: W0213 15:57:22.183277 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.183594 kubelet[1782]: E0213 15:57:22.183288 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.184276 kubelet[1782]: E0213 15:57:22.184100 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.184276 kubelet[1782]: W0213 15:57:22.184118 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.184276 kubelet[1782]: E0213 15:57:22.184174 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.184990 kubelet[1782]: E0213 15:57:22.184793 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.184990 kubelet[1782]: W0213 15:57:22.184817 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.184990 kubelet[1782]: E0213 15:57:22.184833 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.185303 kubelet[1782]: E0213 15:57:22.185172 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.185303 kubelet[1782]: W0213 15:57:22.185182 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.185303 kubelet[1782]: E0213 15:57:22.185196 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.185967 kubelet[1782]: E0213 15:57:22.185485 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.185967 kubelet[1782]: W0213 15:57:22.185495 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.185967 kubelet[1782]: E0213 15:57:22.185512 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.186549 kubelet[1782]: E0213 15:57:22.186375 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.186549 kubelet[1782]: W0213 15:57:22.186393 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.186549 kubelet[1782]: E0213 15:57:22.186409 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.187328 kubelet[1782]: E0213 15:57:22.187090 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.187328 kubelet[1782]: W0213 15:57:22.187200 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.187328 kubelet[1782]: E0213 15:57:22.187219 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.187981 kubelet[1782]: E0213 15:57:22.187960 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.188060 kubelet[1782]: W0213 15:57:22.188045 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.188505 kubelet[1782]: E0213 15:57:22.188337 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.188780 kubelet[1782]: E0213 15:57:22.188706 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.188780 kubelet[1782]: W0213 15:57:22.188719 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.188780 kubelet[1782]: E0213 15:57:22.188731 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.238476 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.242970 kubelet[1782]: W0213 15:57:22.238518 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.238551 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.238931 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.242970 kubelet[1782]: W0213 15:57:22.238947 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.238967 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.239272 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.242970 kubelet[1782]: W0213 15:57:22.239286 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.239303 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.242970 kubelet[1782]: E0213 15:57:22.239522 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.243741 kubelet[1782]: W0213 15:57:22.239533 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.243741 kubelet[1782]: E0213 15:57:22.239547 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.243741 kubelet[1782]: E0213 15:57:22.239745 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.243741 kubelet[1782]: W0213 15:57:22.239756 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.243741 kubelet[1782]: E0213 15:57:22.239769 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.243741 kubelet[1782]: E0213 15:57:22.240015 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.243741 kubelet[1782]: W0213 15:57:22.240027 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.243741 kubelet[1782]: E0213 15:57:22.240039 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.243741 kubelet[1782]: E0213 15:57:22.240586 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.243741 kubelet[1782]: W0213 15:57:22.240600 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.240615 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.240860 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.244030 kubelet[1782]: W0213 15:57:22.240872 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.240885 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.241093 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.244030 kubelet[1782]: W0213 15:57:22.241103 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.241115 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.241326 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.244030 kubelet[1782]: W0213 15:57:22.241336 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.244030 kubelet[1782]: E0213 15:57:22.241348 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.245335 kubelet[1782]: E0213 15:57:22.241574 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.245335 kubelet[1782]: W0213 15:57:22.241585 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.245335 kubelet[1782]: E0213 15:57:22.241598 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:22.245335 kubelet[1782]: E0213 15:57:22.242153 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:22.245335 kubelet[1782]: W0213 15:57:22.242166 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:22.245335 kubelet[1782]: E0213 15:57:22.242183 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:22.901745 kubelet[1782]: E0213 15:57:22.901625 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:23.147143 kubelet[1782]: E0213 15:57:23.147020 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:23.191660 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1712185342.mount: Deactivated successfully. Feb 13 15:57:23.200238 kubelet[1782]: E0213 15:57:23.199968 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.200238 kubelet[1782]: W0213 15:57:23.200023 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.200238 kubelet[1782]: E0213 15:57:23.200159 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.201298 kubelet[1782]: E0213 15:57:23.200562 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.201298 kubelet[1782]: W0213 15:57:23.200578 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.201298 kubelet[1782]: E0213 15:57:23.200593 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.201298 kubelet[1782]: E0213 15:57:23.200833 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.201298 kubelet[1782]: W0213 15:57:23.200844 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.201298 kubelet[1782]: E0213 15:57:23.200858 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.201298 kubelet[1782]: E0213 15:57:23.201194 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.201298 kubelet[1782]: W0213 15:57:23.201205 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.201298 kubelet[1782]: E0213 15:57:23.201222 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.202932 kubelet[1782]: E0213 15:57:23.202741 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.202932 kubelet[1782]: W0213 15:57:23.202770 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.202932 kubelet[1782]: E0213 15:57:23.202794 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.203338 kubelet[1782]: E0213 15:57:23.203072 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.203338 kubelet[1782]: W0213 15:57:23.203091 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.203338 kubelet[1782]: E0213 15:57:23.203105 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.203602 kubelet[1782]: E0213 15:57:23.203589 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.203745 kubelet[1782]: W0213 15:57:23.203640 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.203745 kubelet[1782]: E0213 15:57:23.203654 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.204097 kubelet[1782]: E0213 15:57:23.204019 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.204097 kubelet[1782]: W0213 15:57:23.204030 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.204097 kubelet[1782]: E0213 15:57:23.204042 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.204577 kubelet[1782]: E0213 15:57:23.204475 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.204577 kubelet[1782]: W0213 15:57:23.204488 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.204577 kubelet[1782]: E0213 15:57:23.204500 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.204868 kubelet[1782]: E0213 15:57:23.204709 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.204868 kubelet[1782]: W0213 15:57:23.204718 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.204868 kubelet[1782]: E0213 15:57:23.204730 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.205185 kubelet[1782]: E0213 15:57:23.205172 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.205340 kubelet[1782]: W0213 15:57:23.205250 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.205340 kubelet[1782]: E0213 15:57:23.205270 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.206171 kubelet[1782]: E0213 15:57:23.205964 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.206171 kubelet[1782]: W0213 15:57:23.205987 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.206171 kubelet[1782]: E0213 15:57:23.206002 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.207908 kubelet[1782]: E0213 15:57:23.207680 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.207908 kubelet[1782]: W0213 15:57:23.207707 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.207908 kubelet[1782]: E0213 15:57:23.207730 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.208563 kubelet[1782]: E0213 15:57:23.208542 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.208778 kubelet[1782]: W0213 15:57:23.208647 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.208778 kubelet[1782]: E0213 15:57:23.208668 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.209330 kubelet[1782]: E0213 15:57:23.209313 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.209518 kubelet[1782]: W0213 15:57:23.209420 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.209518 kubelet[1782]: E0213 15:57:23.209444 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.210000 kubelet[1782]: E0213 15:57:23.209905 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.210000 kubelet[1782]: W0213 15:57:23.209924 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.210000 kubelet[1782]: E0213 15:57:23.209944 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.210870 kubelet[1782]: E0213 15:57:23.210853 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.211762 kubelet[1782]: W0213 15:57:23.210940 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.211762 kubelet[1782]: E0213 15:57:23.211510 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.213143 kubelet[1782]: E0213 15:57:23.212841 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.213143 kubelet[1782]: W0213 15:57:23.212981 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.213143 kubelet[1782]: E0213 15:57:23.213004 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.213561 kubelet[1782]: E0213 15:57:23.213475 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.213561 kubelet[1782]: W0213 15:57:23.213527 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.213561 kubelet[1782]: E0213 15:57:23.213548 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.214244 kubelet[1782]: E0213 15:57:23.214076 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.214244 kubelet[1782]: W0213 15:57:23.214109 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.214244 kubelet[1782]: E0213 15:57:23.214217 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.248585 kubelet[1782]: E0213 15:57:23.248338 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.248585 kubelet[1782]: W0213 15:57:23.248379 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.248585 kubelet[1782]: E0213 15:57:23.248408 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.249259 kubelet[1782]: E0213 15:57:23.249106 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.249259 kubelet[1782]: W0213 15:57:23.249147 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.249259 kubelet[1782]: E0213 15:57:23.249174 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.249919 kubelet[1782]: E0213 15:57:23.249880 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.249919 kubelet[1782]: W0213 15:57:23.249910 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.250017 kubelet[1782]: E0213 15:57:23.249939 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.250546 kubelet[1782]: E0213 15:57:23.250507 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.250546 kubelet[1782]: W0213 15:57:23.250530 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.250781 kubelet[1782]: E0213 15:57:23.250638 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.253722 kubelet[1782]: E0213 15:57:23.252647 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.253722 kubelet[1782]: W0213 15:57:23.252682 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.255359 kubelet[1782]: E0213 15:57:23.255311 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.255479 kubelet[1782]: W0213 15:57:23.255371 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.255479 kubelet[1782]: E0213 15:57:23.255402 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.257090 kubelet[1782]: E0213 15:57:23.257054 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.257090 kubelet[1782]: W0213 15:57:23.257085 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.257315 kubelet[1782]: E0213 15:57:23.257164 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.258460 kubelet[1782]: E0213 15:57:23.258426 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.258460 kubelet[1782]: W0213 15:57:23.258456 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.258605 kubelet[1782]: E0213 15:57:23.258488 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.259153 kubelet[1782]: E0213 15:57:23.258848 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.259153 kubelet[1782]: W0213 15:57:23.258869 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.259153 kubelet[1782]: E0213 15:57:23.258896 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.260286 kubelet[1782]: E0213 15:57:23.260260 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.260286 kubelet[1782]: W0213 15:57:23.260284 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.260392 kubelet[1782]: E0213 15:57:23.260303 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.263357 kubelet[1782]: E0213 15:57:23.263312 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.266421 kubelet[1782]: E0213 15:57:23.266374 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.266421 kubelet[1782]: W0213 15:57:23.266412 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.266565 kubelet[1782]: E0213 15:57:23.266443 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:23.267664 kubelet[1782]: E0213 15:57:23.267638 1782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:23.267881 kubelet[1782]: W0213 15:57:23.267812 1782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:23.267881 kubelet[1782]: E0213 15:57:23.267839 1782 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:23.409348 containerd[1468]: time="2025-02-13T15:57:23.408156209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:23.411936 containerd[1468]: time="2025-02-13T15:57:23.411808860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=6855343" Feb 13 15:57:23.413291 containerd[1468]: time="2025-02-13T15:57:23.413237317Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:23.417906 containerd[1468]: time="2025-02-13T15:57:23.417840861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:23.418882 containerd[1468]: time="2025-02-13T15:57:23.418821467Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 2.303167326s" Feb 13 15:57:23.418882 containerd[1468]: time="2025-02-13T15:57:23.418879833Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 15:57:23.423094 containerd[1468]: time="2025-02-13T15:57:23.423021362Z" level=info msg="CreateContainer within sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 15:57:23.425582 systemd-resolved[1326]: Using degraded feature set UDP instead of UDP+EDNS0 for DNS server 67.207.67.3. Feb 13 15:57:23.451737 containerd[1468]: time="2025-02-13T15:57:23.451406213Z" level=info msg="CreateContainer within sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\"" Feb 13 15:57:23.454514 containerd[1468]: time="2025-02-13T15:57:23.452445360Z" level=info msg="StartContainer for \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\"" Feb 13 15:57:23.524987 systemd[1]: Started cri-containerd-7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf.scope - libcontainer container 7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf. Feb 13 15:57:23.591855 containerd[1468]: time="2025-02-13T15:57:23.591667125Z" level=info msg="StartContainer for \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\" returns successfully" Feb 13 15:57:23.606728 systemd[1]: cri-containerd-7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf.scope: Deactivated successfully. 
Feb 13 15:57:23.782616 containerd[1468]: time="2025-02-13T15:57:23.781958798Z" level=info msg="shim disconnected" id=7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf namespace=k8s.io Feb 13 15:57:23.782616 containerd[1468]: time="2025-02-13T15:57:23.782032790Z" level=warning msg="cleaning up after shim disconnected" id=7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf namespace=k8s.io Feb 13 15:57:23.782616 containerd[1468]: time="2025-02-13T15:57:23.782057540Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 15:57:23.902630 kubelet[1782]: E0213 15:57:23.902543 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:24.086733 kubelet[1782]: E0213 15:57:24.086375 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:24.105084 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf-rootfs.mount: Deactivated successfully. 
Feb 13 15:57:24.151818 kubelet[1782]: E0213 15:57:24.151092 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:24.153043 containerd[1468]: time="2025-02-13T15:57:24.152978396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 15:57:24.226715 kubelet[1782]: I0213 15:57:24.226592 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vkzzf" podStartSLOduration=6.677083928 podStartE2EDuration="9.226543606s" podCreationTimestamp="2025-02-13 15:57:15 +0000 UTC" firstStartedPulling="2025-02-13 15:57:18.562785708 +0000 UTC m=+4.298116474" lastFinishedPulling="2025-02-13 15:57:21.112245369 +0000 UTC m=+6.847576152" observedRunningTime="2025-02-13 15:57:22.188467478 +0000 UTC m=+7.923798280" watchObservedRunningTime="2025-02-13 15:57:24.226543606 +0000 UTC m=+9.961874400" Feb 13 15:57:24.903524 kubelet[1782]: E0213 15:57:24.903404 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:25.909611 kubelet[1782]: E0213 15:57:25.909556 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:26.087283 kubelet[1782]: E0213 15:57:26.086618 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:26.910840 kubelet[1782]: E0213 15:57:26.910768 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:27.911441 kubelet[1782]: E0213 15:57:27.911355 1782 
file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:28.087709 kubelet[1782]: E0213 15:57:28.086886 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:28.913169 kubelet[1782]: E0213 15:57:28.913040 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:29.914267 kubelet[1782]: E0213 15:57:29.914175 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:29.942873 containerd[1468]: time="2025-02-13T15:57:29.942698976Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:29.947171 containerd[1468]: time="2025-02-13T15:57:29.947034929Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 15:57:29.949773 containerd[1468]: time="2025-02-13T15:57:29.949224338Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:29.953138 containerd[1468]: time="2025-02-13T15:57:29.953018100Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:29.960263 containerd[1468]: time="2025-02-13T15:57:29.957208730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id 
\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 5.804164158s" Feb 13 15:57:29.961167 containerd[1468]: time="2025-02-13T15:57:29.960330032Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 15:57:29.964572 containerd[1468]: time="2025-02-13T15:57:29.964307642Z" level=info msg="CreateContainer within sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 15:57:30.019569 containerd[1468]: time="2025-02-13T15:57:30.018433766Z" level=info msg="CreateContainer within sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\"" Feb 13 15:57:30.022160 containerd[1468]: time="2025-02-13T15:57:30.020488179Z" level=info msg="StartContainer for \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\"" Feb 13 15:57:30.088042 kubelet[1782]: E0213 15:57:30.087958 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:30.128647 systemd[1]: Started cri-containerd-f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e.scope - libcontainer container f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e. 
Feb 13 15:57:30.241005 containerd[1468]: time="2025-02-13T15:57:30.240674609Z" level=info msg="StartContainer for \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\" returns successfully" Feb 13 15:57:30.917030 kubelet[1782]: E0213 15:57:30.915426 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:31.205555 kubelet[1782]: E0213 15:57:31.204073 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:31.652933 containerd[1468]: time="2025-02-13T15:57:31.652322858Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 15:57:31.659658 systemd[1]: cri-containerd-f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e.scope: Deactivated successfully. Feb 13 15:57:31.720823 kubelet[1782]: I0213 15:57:31.720076 1782 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Feb 13 15:57:31.753978 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e-rootfs.mount: Deactivated successfully. 
Feb 13 15:57:31.958955 kubelet[1782]: E0213 15:57:31.958840 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:31.980162 containerd[1468]: time="2025-02-13T15:57:31.979769985Z" level=info msg="shim disconnected" id=f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e namespace=k8s.io Feb 13 15:57:31.980162 containerd[1468]: time="2025-02-13T15:57:31.979883495Z" level=warning msg="cleaning up after shim disconnected" id=f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e namespace=k8s.io Feb 13 15:57:31.980162 containerd[1468]: time="2025-02-13T15:57:31.979898693Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 15:57:32.096752 systemd[1]: Created slice kubepods-besteffort-pod120c5850_254f_4135_9f32_a6b8fe647e54.slice - libcontainer container kubepods-besteffort-pod120c5850_254f_4135_9f32_a6b8fe647e54.slice. Feb 13 15:57:32.105905 containerd[1468]: time="2025-02-13T15:57:32.105231067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:0,}" Feb 13 15:57:32.213401 kubelet[1782]: E0213 15:57:32.212734 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:32.214946 containerd[1468]: time="2025-02-13T15:57:32.214591729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 15:57:32.216768 systemd-resolved[1326]: Using degraded feature set TCP instead of UDP for DNS server 67.207.67.3. 
Feb 13 15:57:32.273015 containerd[1468]: time="2025-02-13T15:57:32.272915538Z" level=error msg="Failed to destroy network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:32.273803 containerd[1468]: time="2025-02-13T15:57:32.273742979Z" level=error msg="encountered an error cleaning up failed sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:32.274098 containerd[1468]: time="2025-02-13T15:57:32.274055038Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:32.277153 kubelet[1782]: E0213 15:57:32.274902 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:32.277153 kubelet[1782]: E0213 15:57:32.275009 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:32.277153 kubelet[1782]: E0213 15:57:32.275049 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:32.277392 kubelet[1782]: E0213 15:57:32.275110 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:32.277981 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7-shm.mount: Deactivated successfully. 
Feb 13 15:57:32.959478 kubelet[1782]: E0213 15:57:32.959372 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:33.223845 kubelet[1782]: I0213 15:57:33.219472 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7" Feb 13 15:57:33.225695 containerd[1468]: time="2025-02-13T15:57:33.225599529Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:33.226295 containerd[1468]: time="2025-02-13T15:57:33.226002117Z" level=info msg="Ensure that sandbox 91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7 in task-service has been cleanup successfully" Feb 13 15:57:33.228291 containerd[1468]: time="2025-02-13T15:57:33.226349107Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:33.228291 containerd[1468]: time="2025-02-13T15:57:33.226388559Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:33.229215 containerd[1468]: time="2025-02-13T15:57:33.229162853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:1,}" Feb 13 15:57:33.229796 systemd[1]: run-netns-cni\x2d27c8fe79\x2dc138\x2d4c33\x2d3ea5\x2d33edf6bf44e7.mount: Deactivated successfully. 
Feb 13 15:57:33.385975 containerd[1468]: time="2025-02-13T15:57:33.385036113Z" level=error msg="Failed to destroy network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:33.389697 containerd[1468]: time="2025-02-13T15:57:33.389581842Z" level=error msg="encountered an error cleaning up failed sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:33.389877 containerd[1468]: time="2025-02-13T15:57:33.389750877Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:33.390522 kubelet[1782]: E0213 15:57:33.390463 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:33.390832 kubelet[1782]: E0213 15:57:33.390755 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:33.390832 kubelet[1782]: E0213 15:57:33.390809 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:33.392020 kubelet[1782]: E0213 15:57:33.390883 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:33.391008 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf-shm.mount: Deactivated successfully. 
Feb 13 15:57:33.968259 kubelet[1782]: E0213 15:57:33.968154 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:34.224291 kubelet[1782]: I0213 15:57:34.223941 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf" Feb 13 15:57:34.228660 containerd[1468]: time="2025-02-13T15:57:34.228593636Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:34.230048 containerd[1468]: time="2025-02-13T15:57:34.229629656Z" level=info msg="Ensure that sandbox 93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf in task-service has been cleanup successfully" Feb 13 15:57:34.233137 containerd[1468]: time="2025-02-13T15:57:34.230264871Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:34.233137 containerd[1468]: time="2025-02-13T15:57:34.230302998Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:34.235361 containerd[1468]: time="2025-02-13T15:57:34.233661127Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:34.234704 systemd[1]: run-netns-cni\x2dc2638518\x2db2b7\x2d7ce2\x2d61b3\x2df9f149c72637.mount: Deactivated successfully. 
Feb 13 15:57:34.238072 containerd[1468]: time="2025-02-13T15:57:34.236224166Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:34.238072 containerd[1468]: time="2025-02-13T15:57:34.236271022Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:34.239934 containerd[1468]: time="2025-02-13T15:57:34.238627153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:2,}" Feb 13 15:57:34.480710 containerd[1468]: time="2025-02-13T15:57:34.477272934Z" level=error msg="Failed to destroy network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:34.484834 containerd[1468]: time="2025-02-13T15:57:34.484544367Z" level=error msg="encountered an error cleaning up failed sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:34.484834 containerd[1468]: time="2025-02-13T15:57:34.484674574Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Feb 13 15:57:34.487229 kubelet[1782]: E0213 15:57:34.485827 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:34.487229 kubelet[1782]: E0213 15:57:34.485922 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:34.487229 kubelet[1782]: E0213 15:57:34.485963 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:34.486607 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb-shm.mount: Deactivated successfully. 
Feb 13 15:57:34.487841 kubelet[1782]: E0213 15:57:34.486027 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:34.890513 kubelet[1782]: E0213 15:57:34.890304 1782 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:34.969326 kubelet[1782]: E0213 15:57:34.969246 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:35.267430 kubelet[1782]: I0213 15:57:35.265376 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb" Feb 13 15:57:35.274066 containerd[1468]: time="2025-02-13T15:57:35.268662117Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:35.274066 containerd[1468]: time="2025-02-13T15:57:35.269005729Z" level=info msg="Ensure that sandbox 195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb in task-service has been cleanup successfully" Feb 13 15:57:35.280565 systemd[1]: run-netns-cni\x2df2d64d00\x2ddb1f\x2dd5b2\x2dd896\x2d478b19b46ac9.mount: Deactivated successfully. 
Feb 13 15:57:35.287295 containerd[1468]: time="2025-02-13T15:57:35.281338803Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:35.287295 containerd[1468]: time="2025-02-13T15:57:35.281395003Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:35.289916 containerd[1468]: time="2025-02-13T15:57:35.288617722Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:35.289916 containerd[1468]: time="2025-02-13T15:57:35.288791869Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:35.289916 containerd[1468]: time="2025-02-13T15:57:35.288810624Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:35.292595 containerd[1468]: time="2025-02-13T15:57:35.292286338Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:35.292595 containerd[1468]: time="2025-02-13T15:57:35.292488132Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:35.292595 containerd[1468]: time="2025-02-13T15:57:35.292524826Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:35.294599 containerd[1468]: time="2025-02-13T15:57:35.294463958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:3,}" Feb 13 15:57:35.699821 kubelet[1782]: I0213 15:57:35.699024 1782 topology_manager.go:215] "Topology Admit Handler" 
podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d" podNamespace="default" podName="nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:35.736369 systemd[1]: Created slice kubepods-besteffort-podf5de6aa9_e1ad_40fa_aec5_3d5311ff3b5d.slice - libcontainer container kubepods-besteffort-podf5de6aa9_e1ad_40fa_aec5_3d5311ff3b5d.slice. Feb 13 15:57:35.801426 kubelet[1782]: I0213 15:57:35.801344 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl5pp\" (UniqueName: \"kubernetes.io/projected/f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d-kube-api-access-gl5pp\") pod \"nginx-deployment-85f456d6dd-b2htq\" (UID: \"f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d\") " pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:35.822389 containerd[1468]: time="2025-02-13T15:57:35.822311249Z" level=error msg="Failed to destroy network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:35.824182 containerd[1468]: time="2025-02-13T15:57:35.823517254Z" level=error msg="encountered an error cleaning up failed sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:35.827299 containerd[1468]: time="2025-02-13T15:57:35.825261431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:35.825823 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2-shm.mount: Deactivated successfully. Feb 13 15:57:35.828935 kubelet[1782]: E0213 15:57:35.828739 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:35.830048 kubelet[1782]: E0213 15:57:35.828987 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:35.830048 kubelet[1782]: E0213 15:57:35.829222 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:35.830048 kubelet[1782]: E0213 15:57:35.829320 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:35.970144 kubelet[1782]: E0213 15:57:35.969977 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:36.057650 containerd[1468]: time="2025-02-13T15:57:36.057592032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:0,}" Feb 13 15:57:36.303159 kubelet[1782]: I0213 15:57:36.300022 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2" Feb 13 15:57:36.303325 containerd[1468]: time="2025-02-13T15:57:36.301736468Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:36.303325 containerd[1468]: time="2025-02-13T15:57:36.302057667Z" level=info msg="Ensure that sandbox 2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2 in task-service has been cleanup successfully" Feb 13 15:57:36.303325 containerd[1468]: time="2025-02-13T15:57:36.303171991Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:36.303325 containerd[1468]: time="2025-02-13T15:57:36.303209419Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" 
returns successfully" Feb 13 15:57:36.308537 systemd[1]: run-netns-cni\x2d03c0923d\x2d1325\x2d08ed\x2d8c9e\x2dce1104c4e60d.mount: Deactivated successfully. Feb 13 15:57:36.314315 containerd[1468]: time="2025-02-13T15:57:36.313324654Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:36.314315 containerd[1468]: time="2025-02-13T15:57:36.313476627Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:36.314315 containerd[1468]: time="2025-02-13T15:57:36.313696661Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:36.317619 containerd[1468]: time="2025-02-13T15:57:36.314827302Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:36.317619 containerd[1468]: time="2025-02-13T15:57:36.314980995Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:36.317619 containerd[1468]: time="2025-02-13T15:57:36.314999007Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:36.317619 containerd[1468]: time="2025-02-13T15:57:36.315804161Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:36.317619 containerd[1468]: time="2025-02-13T15:57:36.315926402Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:36.317619 containerd[1468]: time="2025-02-13T15:57:36.315943826Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:36.317619 containerd[1468]: 
time="2025-02-13T15:57:36.316915626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:4,}" Feb 13 15:57:36.434237 containerd[1468]: time="2025-02-13T15:57:36.431264444Z" level=error msg="Failed to destroy network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.434237 containerd[1468]: time="2025-02-13T15:57:36.432729186Z" level=error msg="encountered an error cleaning up failed sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.434237 containerd[1468]: time="2025-02-13T15:57:36.432848056Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:0,} failed, error" error="failed to setup network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.434845 kubelet[1782]: E0213 15:57:36.434366 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Feb 13 15:57:36.434845 kubelet[1782]: E0213 15:57:36.434457 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:36.434845 kubelet[1782]: E0213 15:57:36.434491 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:36.435001 kubelet[1782]: E0213 15:57:36.434555 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d" Feb 13 15:57:36.436400 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610-shm.mount: Deactivated successfully. Feb 13 15:57:36.578324 containerd[1468]: time="2025-02-13T15:57:36.578065164Z" level=error msg="Failed to destroy network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.584632 containerd[1468]: time="2025-02-13T15:57:36.582467082Z" level=error msg="encountered an error cleaning up failed sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.584632 containerd[1468]: time="2025-02-13T15:57:36.582590283Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.584057 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0-shm.mount: Deactivated successfully. 
Feb 13 15:57:36.587877 kubelet[1782]: E0213 15:57:36.586311 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:36.587877 kubelet[1782]: E0213 15:57:36.586400 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:36.587877 kubelet[1782]: E0213 15:57:36.586732 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:36.588830 kubelet[1782]: E0213 15:57:36.586829 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:36.971136 kubelet[1782]: E0213 15:57:36.971042 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:37.325389 kubelet[1782]: I0213 15:57:37.323573 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0" Feb 13 15:57:37.332173 containerd[1468]: time="2025-02-13T15:57:37.326556265Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" Feb 13 15:57:37.332173 containerd[1468]: time="2025-02-13T15:57:37.326854879Z" level=info msg="Ensure that sandbox 89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0 in task-service has been cleanup successfully" Feb 13 15:57:37.337996 containerd[1468]: time="2025-02-13T15:57:37.334021475Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully" Feb 13 15:57:37.337996 containerd[1468]: time="2025-02-13T15:57:37.337543293Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully" Feb 13 15:57:37.335171 systemd[1]: run-netns-cni\x2d55561c92\x2d65db\x2de292\x2da92e\x2d2803174b6ec6.mount: Deactivated successfully. 
Feb 13 15:57:37.343061 containerd[1468]: time="2025-02-13T15:57:37.341235665Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:37.343061 containerd[1468]: time="2025-02-13T15:57:37.341402132Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:37.343061 containerd[1468]: time="2025-02-13T15:57:37.341424858Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully" Feb 13 15:57:37.344713 containerd[1468]: time="2025-02-13T15:57:37.343657238Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:37.344713 containerd[1468]: time="2025-02-13T15:57:37.343817885Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:37.344713 containerd[1468]: time="2025-02-13T15:57:37.343837356Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:37.344963 kubelet[1782]: I0213 15:57:37.344452 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610" Feb 13 15:57:37.346508 containerd[1468]: time="2025-02-13T15:57:37.345528238Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:37.346508 containerd[1468]: time="2025-02-13T15:57:37.345733379Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:37.346508 containerd[1468]: time="2025-02-13T15:57:37.345753787Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns 
successfully" Feb 13 15:57:37.348998 containerd[1468]: time="2025-02-13T15:57:37.348938979Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" Feb 13 15:57:37.349475 containerd[1468]: time="2025-02-13T15:57:37.349426236Z" level=info msg="Ensure that sandbox 46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610 in task-service has been cleanup successfully" Feb 13 15:57:37.350503 containerd[1468]: time="2025-02-13T15:57:37.348968487Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:37.352789 containerd[1468]: time="2025-02-13T15:57:37.352581517Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:37.352789 containerd[1468]: time="2025-02-13T15:57:37.352619782Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:37.352789 containerd[1468]: time="2025-02-13T15:57:37.352274834Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully" Feb 13 15:57:37.352789 containerd[1468]: time="2025-02-13T15:57:37.352731313Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully" Feb 13 15:57:37.356252 containerd[1468]: time="2025-02-13T15:57:37.355094535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:5,}" Feb 13 15:57:37.355705 systemd[1]: run-netns-cni\x2da1bfef38\x2de3b9\x2d41de\x2dba6b\x2d1656aea1b766.mount: Deactivated successfully. 
Feb 13 15:57:37.360553 containerd[1468]: time="2025-02-13T15:57:37.359367436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:1,}" Feb 13 15:57:37.581532 containerd[1468]: time="2025-02-13T15:57:37.581305731Z" level=error msg="Failed to destroy network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.583417 containerd[1468]: time="2025-02-13T15:57:37.583327809Z" level=error msg="encountered an error cleaning up failed sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.583613 containerd[1468]: time="2025-02-13T15:57:37.583524057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.583949 kubelet[1782]: E0213 15:57:37.583885 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.584226 kubelet[1782]: E0213 15:57:37.583979 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:37.584226 kubelet[1782]: E0213 15:57:37.584016 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:37.584667 kubelet[1782]: E0213 15:57:37.584218 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:37.591057 containerd[1468]: time="2025-02-13T15:57:37.590647242Z" level=error msg="Failed to destroy network for sandbox 
\"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.593071 containerd[1468]: time="2025-02-13T15:57:37.591964517Z" level=error msg="encountered an error cleaning up failed sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.593071 containerd[1468]: time="2025-02-13T15:57:37.592844627Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.593470 kubelet[1782]: E0213 15:57:37.593395 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:37.593580 kubelet[1782]: E0213 15:57:37.593504 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:37.593580 kubelet[1782]: E0213 15:57:37.593534 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:37.593660 kubelet[1782]: E0213 15:57:37.593600 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d" Feb 13 15:57:37.977318 kubelet[1782]: E0213 15:57:37.976852 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:38.334192 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4-shm.mount: Deactivated successfully. 
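Every failure in the entries above bottoms out in the same stat call: the Calico CNI plugin cannot find /var/lib/calico/nodename, the file calico/node writes on startup once it is running and has /var/lib/calico/ mounted. A minimal on-node check, run as a sketch under that assumption (the path comes from the log; the messages printed here are illustrative, not part of the journal):

```shell
#!/bin/sh
# Check for the file the CNI plugin stats in the errors above.
# /var/lib/calico/nodename is written by calico/node when it starts
# with /var/lib/calico/ mounted from the host.
if [ -f /var/lib/calico/nodename ]; then
    echo "nodename present: $(cat /var/lib/calico/nodename)"
else
    echo "nodename missing: calico/node is not running or /var/lib/calico/ is not mounted"
fi
```

If the file is missing, the next place to look is the calico/node container itself rather than the failing pods, since every sandbox add/delete on this node will keep failing until it appears.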
Feb 13 15:57:38.334325 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2-shm.mount: Deactivated successfully. Feb 13 15:57:38.357652 kubelet[1782]: I0213 15:57:38.357435 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2" Feb 13 15:57:38.360527 containerd[1468]: time="2025-02-13T15:57:38.360405301Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\"" Feb 13 15:57:38.363479 containerd[1468]: time="2025-02-13T15:57:38.360814459Z" level=info msg="Ensure that sandbox e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2 in task-service has been cleanup successfully" Feb 13 15:57:38.365958 systemd[1]: run-netns-cni\x2d9426e40a\x2d561e\x2d2a20\x2d2848\x2db46d380f9f2c.mount: Deactivated successfully. Feb 13 15:57:38.366882 kubelet[1782]: I0213 15:57:38.366461 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4" Feb 13 15:57:38.368423 containerd[1468]: time="2025-02-13T15:57:38.368076832Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\"" Feb 13 15:57:38.369919 containerd[1468]: time="2025-02-13T15:57:38.369869877Z" level=info msg="Ensure that sandbox 63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4 in task-service has been cleanup successfully" Feb 13 15:57:38.372498 systemd[1]: run-netns-cni\x2d2cc41816\x2d0a01\x2dddbf\x2d4136\x2d98c464707b19.mount: Deactivated successfully. 
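The run-netns mount units in the systemd entries above use systemd unit-name escaping: `\x2d` is the escaped form of a literal `-` inside a path component (unescaped hyphens in a unit name stand for `/`, so these units correspond to namespaces under /run/netns/). A quick sed decode, using a unit name copied from the log:

```shell
# Decode systemd's \x2d escapes back to literal hyphens in a mount-unit
# name taken from the journal entries above.
unit='run-netns-cni\x2d9426e40a\x2d561e\x2d2a20\x2d2848\x2db46d380f9f2c.mount'
printf '%s\n' "$unit" | sed 's/\\x2d/-/g'
```

The decoded name shows the CNI network-namespace file (cni-9426e40a-…) whose bind mount systemd tore down after the sandbox cleanup.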
Feb 13 15:57:38.375028 containerd[1468]: time="2025-02-13T15:57:38.374239087Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully" Feb 13 15:57:38.375028 containerd[1468]: time="2025-02-13T15:57:38.374294937Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully" Feb 13 15:57:38.375028 containerd[1468]: time="2025-02-13T15:57:38.374673814Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully" Feb 13 15:57:38.375028 containerd[1468]: time="2025-02-13T15:57:38.374700098Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully" Feb 13 15:57:38.376401 containerd[1468]: time="2025-02-13T15:57:38.375935052Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" Feb 13 15:57:38.376401 containerd[1468]: time="2025-02-13T15:57:38.376085929Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully" Feb 13 15:57:38.376401 containerd[1468]: time="2025-02-13T15:57:38.376105108Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully" Feb 13 15:57:38.376700 containerd[1468]: time="2025-02-13T15:57:38.376399050Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" Feb 13 15:57:38.376700 containerd[1468]: time="2025-02-13T15:57:38.376525207Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully" Feb 13 15:57:38.376700 containerd[1468]: time="2025-02-13T15:57:38.376542494Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" 
returns successfully" Feb 13 15:57:38.377530 containerd[1468]: time="2025-02-13T15:57:38.377089319Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:38.377824 containerd[1468]: time="2025-02-13T15:57:38.377733271Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:38.377824 containerd[1468]: time="2025-02-13T15:57:38.377764349Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully" Feb 13 15:57:38.379718 containerd[1468]: time="2025-02-13T15:57:38.379670187Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:38.379841 containerd[1468]: time="2025-02-13T15:57:38.379818852Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:38.379841 containerd[1468]: time="2025-02-13T15:57:38.379836561Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:38.380008 containerd[1468]: time="2025-02-13T15:57:38.379982860Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:2,}" Feb 13 15:57:38.381404 containerd[1468]: time="2025-02-13T15:57:38.381371811Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:38.381742 containerd[1468]: time="2025-02-13T15:57:38.381606143Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:38.381742 containerd[1468]: time="2025-02-13T15:57:38.381629588Z" level=info msg="StopPodSandbox for 
\"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:38.382456 containerd[1468]: time="2025-02-13T15:57:38.382367998Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:38.382783 containerd[1468]: time="2025-02-13T15:57:38.382670863Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:38.382783 containerd[1468]: time="2025-02-13T15:57:38.382693203Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:38.384370 containerd[1468]: time="2025-02-13T15:57:38.384319528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:6,}" Feb 13 15:57:38.594543 containerd[1468]: time="2025-02-13T15:57:38.594236926Z" level=error msg="Failed to destroy network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.596101 containerd[1468]: time="2025-02-13T15:57:38.595641738Z" level=error msg="encountered an error cleaning up failed sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.597191 containerd[1468]: time="2025-02-13T15:57:38.596455056Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.597610 kubelet[1782]: E0213 15:57:38.597568 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.597849 kubelet[1782]: E0213 15:57:38.597808 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:38.598022 kubelet[1782]: E0213 15:57:38.597994 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:38.599029 kubelet[1782]: E0213 15:57:38.598892 1782 pod_workers.go:1298] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d" Feb 13 15:57:38.629288 containerd[1468]: time="2025-02-13T15:57:38.629182509Z" level=error msg="Failed to destroy network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.630142 containerd[1468]: time="2025-02-13T15:57:38.630003395Z" level=error msg="encountered an error cleaning up failed sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.630488 containerd[1468]: time="2025-02-13T15:57:38.630341821Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.630868 kubelet[1782]: E0213 15:57:38.630808 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:38.630967 kubelet[1782]: E0213 15:57:38.630887 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:38.630967 kubelet[1782]: E0213 15:57:38.630932 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:38.631070 kubelet[1782]: E0213 15:57:38.631005 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:38.979179 kubelet[1782]: E0213 15:57:38.977726 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:39.334294 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646-shm.mount: Deactivated successfully. Feb 13 15:57:39.388194 kubelet[1782]: I0213 15:57:39.387761 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65" Feb 13 15:57:39.393222 containerd[1468]: time="2025-02-13T15:57:39.389591463Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\"" Feb 13 15:57:39.393222 containerd[1468]: time="2025-02-13T15:57:39.389923827Z" level=info msg="Ensure that sandbox a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65 in task-service has been cleanup successfully" Feb 13 15:57:39.393875 containerd[1468]: time="2025-02-13T15:57:39.393608514Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully" Feb 13 15:57:39.393875 containerd[1468]: time="2025-02-13T15:57:39.393672492Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully" Feb 13 15:57:39.395953 systemd[1]: run-netns-cni\x2d167d44ac\x2d5697\x2d05fd\x2d402b\x2d1aa84692dfcc.mount: Deactivated successfully. 
Feb 13 15:57:39.402610 containerd[1468]: time="2025-02-13T15:57:39.401444083Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\"" Feb 13 15:57:39.402610 containerd[1468]: time="2025-02-13T15:57:39.401605063Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully" Feb 13 15:57:39.402610 containerd[1468]: time="2025-02-13T15:57:39.401685297Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully" Feb 13 15:57:39.404163 containerd[1468]: time="2025-02-13T15:57:39.403790271Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" Feb 13 15:57:39.404163 containerd[1468]: time="2025-02-13T15:57:39.403974804Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully" Feb 13 15:57:39.404163 containerd[1468]: time="2025-02-13T15:57:39.404161795Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully" Feb 13 15:57:39.405547 containerd[1468]: time="2025-02-13T15:57:39.405263181Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:39.405547 containerd[1468]: time="2025-02-13T15:57:39.405410644Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:39.405547 containerd[1468]: time="2025-02-13T15:57:39.405429220Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully" Feb 13 15:57:39.406975 containerd[1468]: time="2025-02-13T15:57:39.406611320Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:39.406975 
containerd[1468]: time="2025-02-13T15:57:39.406895623Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:39.407522 kubelet[1782]: I0213 15:57:39.407101 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646" Feb 13 15:57:39.407978 containerd[1468]: time="2025-02-13T15:57:39.407720878Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:39.409796 containerd[1468]: time="2025-02-13T15:57:39.409559719Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:39.410048 containerd[1468]: time="2025-02-13T15:57:39.410023550Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:39.410202 containerd[1468]: time="2025-02-13T15:57:39.410182525Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:39.410810 containerd[1468]: time="2025-02-13T15:57:39.409689337Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\"" Feb 13 15:57:39.410810 containerd[1468]: time="2025-02-13T15:57:39.410628838Z" level=info msg="Ensure that sandbox 85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646 in task-service has been cleanup successfully" Feb 13 15:57:39.412250 containerd[1468]: time="2025-02-13T15:57:39.412215022Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully" Feb 13 15:57:39.412812 containerd[1468]: time="2025-02-13T15:57:39.412782361Z" level=info msg="StopPodSandbox for 
\"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully" Feb 13 15:57:39.414283 containerd[1468]: time="2025-02-13T15:57:39.412865468Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:39.414531 containerd[1468]: time="2025-02-13T15:57:39.414508848Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:39.414890 containerd[1468]: time="2025-02-13T15:57:39.414866564Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:39.415453 systemd[1]: run-netns-cni\x2db52baa48\x2d65b7\x2dc61d\x2deb09\x2db9796e1cb9bd.mount: Deactivated successfully. Feb 13 15:57:39.418207 containerd[1468]: time="2025-02-13T15:57:39.417764540Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\"" Feb 13 15:57:39.418207 containerd[1468]: time="2025-02-13T15:57:39.418004876Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully" Feb 13 15:57:39.418207 containerd[1468]: time="2025-02-13T15:57:39.418029701Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully" Feb 13 15:57:39.418412 containerd[1468]: time="2025-02-13T15:57:39.418225009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:7,}" Feb 13 15:57:39.421955 containerd[1468]: time="2025-02-13T15:57:39.421902506Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" Feb 13 15:57:39.423093 containerd[1468]: time="2025-02-13T15:57:39.423055540Z" level=info msg="TearDown network for sandbox 
\"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully" Feb 13 15:57:39.424230 containerd[1468]: time="2025-02-13T15:57:39.423777161Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully" Feb 13 15:57:39.425366 containerd[1468]: time="2025-02-13T15:57:39.424851162Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:3,}" Feb 13 15:57:39.653178 containerd[1468]: time="2025-02-13T15:57:39.652691766Z" level=error msg="Failed to destroy network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:39.655090 containerd[1468]: time="2025-02-13T15:57:39.655016999Z" level=error msg="encountered an error cleaning up failed sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:39.655317 containerd[1468]: time="2025-02-13T15:57:39.655161900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:39.656003 kubelet[1782]: E0213 15:57:39.655765 1782 remote_runtime.go:193] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:39.656003 kubelet[1782]: E0213 15:57:39.655865 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:39.656003 kubelet[1782]: E0213 15:57:39.655896 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:39.656447 kubelet[1782]: E0213 15:57:39.655975 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:39.679818 containerd[1468]: time="2025-02-13T15:57:39.679703936Z" level=error msg="Failed to destroy network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:39.681692 containerd[1468]: time="2025-02-13T15:57:39.681610444Z" level=error msg="encountered an error cleaning up failed sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:39.683887 containerd[1468]: time="2025-02-13T15:57:39.682472086Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:39.684161 kubelet[1782]: E0213 15:57:39.683334 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:39.684161 kubelet[1782]: E0213 15:57:39.683419 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:39.684161 kubelet[1782]: E0213 15:57:39.683453 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:39.684312 kubelet[1782]: E0213 15:57:39.683516 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d"
Feb 13 15:57:39.978932 kubelet[1782]: E0213 15:57:39.978854 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:40.335491 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74-shm.mount: Deactivated successfully.
Feb 13 15:57:40.415958 kubelet[1782]: I0213 15:57:40.415914 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1"
Feb 13 15:57:40.422163 containerd[1468]: time="2025-02-13T15:57:40.417172871Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\""
Feb 13 15:57:40.422163 containerd[1468]: time="2025-02-13T15:57:40.417682077Z" level=info msg="Ensure that sandbox f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1 in task-service has been cleanup successfully"
Feb 13 15:57:40.425265 systemd[1]: run-netns-cni\x2dfb013582\x2d6083\x2dc6c5\x2d722f\x2dfe61f33d218c.mount: Deactivated successfully.
Feb 13 15:57:40.426713 containerd[1468]: time="2025-02-13T15:57:40.425329877Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully"
Feb 13 15:57:40.426713 containerd[1468]: time="2025-02-13T15:57:40.425381590Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully"
Feb 13 15:57:40.433936 containerd[1468]: time="2025-02-13T15:57:40.433851324Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\""
Feb 13 15:57:40.434230 containerd[1468]: time="2025-02-13T15:57:40.434011060Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully"
Feb 13 15:57:40.434230 containerd[1468]: time="2025-02-13T15:57:40.434029420Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully"
Feb 13 15:57:40.436448 containerd[1468]: time="2025-02-13T15:57:40.436199666Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\""
Feb 13 15:57:40.436448 containerd[1468]: time="2025-02-13T15:57:40.436333499Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully"
Feb 13 15:57:40.436448 containerd[1468]: time="2025-02-13T15:57:40.436350314Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully"
Feb 13 15:57:40.438103 containerd[1468]: time="2025-02-13T15:57:40.437501795Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\""
Feb 13 15:57:40.438103 containerd[1468]: time="2025-02-13T15:57:40.437646431Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully"
Feb 13 15:57:40.438103 containerd[1468]: time="2025-02-13T15:57:40.437662817Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully"
Feb 13 15:57:40.438364 containerd[1468]: time="2025-02-13T15:57:40.438194569Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\""
Feb 13 15:57:40.438364 containerd[1468]: time="2025-02-13T15:57:40.438302544Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully"
Feb 13 15:57:40.438364 containerd[1468]: time="2025-02-13T15:57:40.438316737Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully"
Feb 13 15:57:40.439801 kubelet[1782]: I0213 15:57:40.438658 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74"
Feb 13 15:57:40.440088 containerd[1468]: time="2025-02-13T15:57:40.440050655Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\""
Feb 13 15:57:40.440223 containerd[1468]: time="2025-02-13T15:57:40.440192624Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully"
Feb 13 15:57:40.440223 containerd[1468]: time="2025-02-13T15:57:40.440209316Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully"
Feb 13 15:57:40.440297 containerd[1468]: time="2025-02-13T15:57:40.440278259Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\""
Feb 13 15:57:40.441685 containerd[1468]: time="2025-02-13T15:57:40.441427136Z" level=info msg="Ensure that sandbox e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74 in task-service has been cleanup successfully"
Feb 13 15:57:40.441837 containerd[1468]: time="2025-02-13T15:57:40.441812904Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully"
Feb 13 15:57:40.441901 containerd[1468]: time="2025-02-13T15:57:40.441837117Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully"
Feb 13 15:57:40.445202 containerd[1468]: time="2025-02-13T15:57:40.443683614Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\""
Feb 13 15:57:40.445202 containerd[1468]: time="2025-02-13T15:57:40.443815360Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully"
Feb 13 15:57:40.445202 containerd[1468]: time="2025-02-13T15:57:40.443836802Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully"
Feb 13 15:57:40.445202 containerd[1468]: time="2025-02-13T15:57:40.443934266Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\""
Feb 13 15:57:40.445202 containerd[1468]: time="2025-02-13T15:57:40.444016981Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully"
Feb 13 15:57:40.445202 containerd[1468]: time="2025-02-13T15:57:40.444035134Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully"
Feb 13 15:57:40.447836 systemd[1]: run-netns-cni\x2da6a077b6\x2d0d3d\x2dcb45\x2d07c1\x2d3a6f81c4dd30.mount: Deactivated successfully.
Feb 13 15:57:40.450595 containerd[1468]: time="2025-02-13T15:57:40.449873869Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\""
Feb 13 15:57:40.450595 containerd[1468]: time="2025-02-13T15:57:40.450027167Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully"
Feb 13 15:57:40.450595 containerd[1468]: time="2025-02-13T15:57:40.450045460Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully"
Feb 13 15:57:40.450846 containerd[1468]: time="2025-02-13T15:57:40.450687903Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\""
Feb 13 15:57:40.450846 containerd[1468]: time="2025-02-13T15:57:40.450817952Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully"
Feb 13 15:57:40.450846 containerd[1468]: time="2025-02-13T15:57:40.450835253Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully"
Feb 13 15:57:40.453047 containerd[1468]: time="2025-02-13T15:57:40.452398030Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\""
Feb 13 15:57:40.453047 containerd[1468]: time="2025-02-13T15:57:40.452544196Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully"
Feb 13 15:57:40.453047 containerd[1468]: time="2025-02-13T15:57:40.452563899Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully"
Feb 13 15:57:40.453329 containerd[1468]: time="2025-02-13T15:57:40.453211984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:8,}"
Feb 13 15:57:40.453764 containerd[1468]: time="2025-02-13T15:57:40.453617139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:4,}"
Feb 13 15:57:40.657500 containerd[1468]: time="2025-02-13T15:57:40.656254904Z" level=error msg="Failed to destroy network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.657500 containerd[1468]: time="2025-02-13T15:57:40.656723299Z" level=error msg="encountered an error cleaning up failed sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.657500 containerd[1468]: time="2025-02-13T15:57:40.656796372Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:8,} failed, error" error="failed to setup network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.657824 kubelet[1782]: E0213 15:57:40.657771 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.658133 kubelet[1782]: E0213 15:57:40.658021 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:40.658133 kubelet[1782]: E0213 15:57:40.658067 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:40.658133 kubelet[1782]: E0213 15:57:40.658113 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:40.679465 containerd[1468]: time="2025-02-13T15:57:40.679194766Z" level=error msg="Failed to destroy network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.680053 containerd[1468]: time="2025-02-13T15:57:40.679951435Z" level=error msg="encountered an error cleaning up failed sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.680433 containerd[1468]: time="2025-02-13T15:57:40.680252832Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.680699 kubelet[1782]: E0213 15:57:40.680621 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:40.680861 kubelet[1782]: E0213 15:57:40.680719 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:40.680861 kubelet[1782]: E0213 15:57:40.680752 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:40.680861 kubelet[1782]: E0213 15:57:40.680826 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d"
Feb 13 15:57:40.980058 kubelet[1782]: E0213 15:57:40.979843 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:41.334928 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0-shm.mount: Deactivated successfully.
Feb 13 15:57:41.335064 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266-shm.mount: Deactivated successfully.
Feb 13 15:57:41.461535 kubelet[1782]: I0213 15:57:41.460793 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266"
Feb 13 15:57:41.463380 containerd[1468]: time="2025-02-13T15:57:41.463031030Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\""
Feb 13 15:57:41.463791 containerd[1468]: time="2025-02-13T15:57:41.463689737Z" level=info msg="Ensure that sandbox 3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266 in task-service has been cleanup successfully"
Feb 13 15:57:41.468405 containerd[1468]: time="2025-02-13T15:57:41.467999431Z" level=info msg="TearDown network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" successfully"
Feb 13 15:57:41.468405 containerd[1468]: time="2025-02-13T15:57:41.468055516Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" returns successfully"
Feb 13 15:57:41.469313 containerd[1468]: time="2025-02-13T15:57:41.469247007Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\""
Feb 13 15:57:41.469647 containerd[1468]: time="2025-02-13T15:57:41.469513282Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully"
Feb 13 15:57:41.469647 containerd[1468]: time="2025-02-13T15:57:41.469532014Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully"
Feb 13 15:57:41.470323 systemd[1]: run-netns-cni\x2d0b974a5d\x2d50ba\x2daf9e\x2db23e\x2d8a2c94abf479.mount: Deactivated successfully.
Feb 13 15:57:41.472391 containerd[1468]: time="2025-02-13T15:57:41.471807382Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\""
Feb 13 15:57:41.472391 containerd[1468]: time="2025-02-13T15:57:41.471928018Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully"
Feb 13 15:57:41.472391 containerd[1468]: time="2025-02-13T15:57:41.471939915Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully"
Feb 13 15:57:41.473821 containerd[1468]: time="2025-02-13T15:57:41.473685072Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\""
Feb 13 15:57:41.473821 containerd[1468]: time="2025-02-13T15:57:41.473812439Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully"
Feb 13 15:57:41.473821 containerd[1468]: time="2025-02-13T15:57:41.473824822Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully"
Feb 13 15:57:41.475959 containerd[1468]: time="2025-02-13T15:57:41.475877357Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\""
Feb 13 15:57:41.476170 containerd[1468]: time="2025-02-13T15:57:41.475992264Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully"
Feb 13 15:57:41.476170 containerd[1468]: time="2025-02-13T15:57:41.476010071Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully"
Feb 13 15:57:41.477212 kubelet[1782]: I0213 15:57:41.476631 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0"
Feb 13 15:57:41.478015 containerd[1468]: time="2025-02-13T15:57:41.477980321Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\""
Feb 13 15:57:41.478896 containerd[1468]: time="2025-02-13T15:57:41.478869526Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\""
Feb 13 15:57:41.478992 containerd[1468]: time="2025-02-13T15:57:41.478974989Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully"
Feb 13 15:57:41.479035 containerd[1468]: time="2025-02-13T15:57:41.478993132Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully"
Feb 13 15:57:41.479620 containerd[1468]: time="2025-02-13T15:57:41.479587176Z" level=info msg="Ensure that sandbox dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0 in task-service has been cleanup successfully"
Feb 13 15:57:41.479865 containerd[1468]: time="2025-02-13T15:57:41.479829912Z" level=info msg="TearDown network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully"
Feb 13 15:57:41.479865 containerd[1468]: time="2025-02-13T15:57:41.479851676Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully"
Feb 13 15:57:41.484515 containerd[1468]: time="2025-02-13T15:57:41.482968896Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\""
Feb 13 15:57:41.484515 containerd[1468]: time="2025-02-13T15:57:41.483112770Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully"
Feb 13 15:57:41.484515 containerd[1468]: time="2025-02-13T15:57:41.483924566Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\""
Feb 13 15:57:41.484515 containerd[1468]: time="2025-02-13T15:57:41.484454850Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully"
Feb 13 15:57:41.484515 containerd[1468]: time="2025-02-13T15:57:41.484468106Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully"
Feb 13 15:57:41.483785 systemd[1]: run-netns-cni\x2d9111c5c5\x2d4513\x2d9849\x2db124\x2d26a962384440.mount: Deactivated successfully.
Feb 13 15:57:41.486166 containerd[1468]: time="2025-02-13T15:57:41.486036916Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully"
Feb 13 15:57:41.486491 containerd[1468]: time="2025-02-13T15:57:41.486056264Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\""
Feb 13 15:57:41.486697 containerd[1468]: time="2025-02-13T15:57:41.486612441Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully"
Feb 13 15:57:41.486776 containerd[1468]: time="2025-02-13T15:57:41.486698186Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully"
Feb 13 15:57:41.487640 containerd[1468]: time="2025-02-13T15:57:41.487602149Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\""
Feb 13 15:57:41.487770 containerd[1468]: time="2025-02-13T15:57:41.487746345Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully"
Feb 13 15:57:41.487826 containerd[1468]: time="2025-02-13T15:57:41.487767053Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully"
Feb 13 15:57:41.487864 containerd[1468]: time="2025-02-13T15:57:41.487840422Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\""
Feb 13 15:57:41.487938 containerd[1468]: time="2025-02-13T15:57:41.487917879Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully"
Feb 13 15:57:41.487982 containerd[1468]: time="2025-02-13T15:57:41.487936596Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully"
Feb 13 15:57:41.488749 containerd[1468]: time="2025-02-13T15:57:41.488702948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:9,}"
Feb 13 15:57:41.489709 containerd[1468]: time="2025-02-13T15:57:41.489575898Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\""
Feb 13 15:57:41.489899 containerd[1468]: time="2025-02-13T15:57:41.489877286Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully"
Feb 13 15:57:41.490028 containerd[1468]: time="2025-02-13T15:57:41.489966491Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully"
Feb 13 15:57:41.491047 containerd[1468]: time="2025-02-13T15:57:41.490635269Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\""
Feb 13 15:57:41.491047 containerd[1468]: time="2025-02-13T15:57:41.490750378Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully"
Feb 13 15:57:41.491047 containerd[1468]: time="2025-02-13T15:57:41.490762181Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully"
Feb 13 15:57:41.492675 containerd[1468]: time="2025-02-13T15:57:41.492596485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:5,}"
Feb 13 15:57:41.816360 containerd[1468]: time="2025-02-13T15:57:41.816228714Z" level=error msg="Failed to destroy network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.817720 containerd[1468]: time="2025-02-13T15:57:41.817566129Z" level=error msg="encountered an error cleaning up failed sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.817720 containerd[1468]: time="2025-02-13T15:57:41.817659148Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.818639 kubelet[1782]: E0213 15:57:41.817979 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.818639 kubelet[1782]: E0213 15:57:41.818069 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:41.818639 kubelet[1782]: E0213 15:57:41.818101 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:41.818778 kubelet[1782]: E0213 15:57:41.818184 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d"
Feb 13 15:57:41.868532 containerd[1468]: time="2025-02-13T15:57:41.867538464Z" level=error msg="Failed to destroy network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.868532 containerd[1468]: time="2025-02-13T15:57:41.868456393Z" level=error msg="encountered an error cleaning up failed sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.868532 containerd[1468]: time="2025-02-13T15:57:41.868547735Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:9,} failed, error" error="failed to setup network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.869227 kubelet[1782]: E0213 15:57:41.868938 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:41.869227 kubelet[1782]: E0213 15:57:41.869024 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:41.869227 kubelet[1782]: E0213 15:57:41.869049 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6"
Feb 13 15:57:41.869393 kubelet[1782]: E0213 15:57:41.869101 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54"
Feb 13 15:57:42.002590 kubelet[1782]: E0213 15:57:42.002527 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:42.335722 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc-shm.mount: Deactivated successfully.
Feb 13 15:57:42.335918 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2-shm.mount: Deactivated successfully.
Feb 13 15:57:42.499166 kubelet[1782]: I0213 15:57:42.498470 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2"
Feb 13 15:57:42.499988 containerd[1468]: time="2025-02-13T15:57:42.499943953Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\""
Feb 13 15:57:42.501325 containerd[1468]: time="2025-02-13T15:57:42.500750177Z" level=info msg="Ensure that sandbox fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2 in task-service has been cleanup successfully"
Feb 13 15:57:42.501325 containerd[1468]: time="2025-02-13T15:57:42.501080707Z" level=info msg="TearDown network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" successfully"
Feb 13 15:57:42.501325 containerd[1468]: time="2025-02-13T15:57:42.501107844Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" returns successfully"
Feb 13 15:57:42.504750 systemd[1]: run-netns-cni\x2d1c6623c9\x2dae3b\x2de2cc\x2d0704\x2dc88907f8ede3.mount: Deactivated successfully.
Feb 13 15:57:42.509529 containerd[1468]: time="2025-02-13T15:57:42.509473316Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\"" Feb 13 15:57:42.514370 containerd[1468]: time="2025-02-13T15:57:42.514000993Z" level=info msg="TearDown network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" successfully" Feb 13 15:57:42.514370 containerd[1468]: time="2025-02-13T15:57:42.514091016Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" returns successfully" Feb 13 15:57:42.515345 kubelet[1782]: I0213 15:57:42.515079 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc" Feb 13 15:57:42.516579 containerd[1468]: time="2025-02-13T15:57:42.516235976Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\"" Feb 13 15:57:42.517294 containerd[1468]: time="2025-02-13T15:57:42.517153479Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully" Feb 13 15:57:42.517294 containerd[1468]: time="2025-02-13T15:57:42.517230449Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully" Feb 13 15:57:42.518461 containerd[1468]: time="2025-02-13T15:57:42.517605790Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\"" Feb 13 15:57:42.518461 containerd[1468]: time="2025-02-13T15:57:42.517732966Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully" Feb 13 15:57:42.518461 containerd[1468]: time="2025-02-13T15:57:42.517753067Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns 
successfully" Feb 13 15:57:42.518461 containerd[1468]: time="2025-02-13T15:57:42.517868086Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\"" Feb 13 15:57:42.518461 containerd[1468]: time="2025-02-13T15:57:42.518098817Z" level=info msg="Ensure that sandbox 5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc in task-service has been cleanup successfully" Feb 13 15:57:42.519401 containerd[1468]: time="2025-02-13T15:57:42.519340933Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\"" Feb 13 15:57:42.519511 containerd[1468]: time="2025-02-13T15:57:42.519463957Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully" Feb 13 15:57:42.519511 containerd[1468]: time="2025-02-13T15:57:42.519480141Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully" Feb 13 15:57:42.521542 containerd[1468]: time="2025-02-13T15:57:42.521466891Z" level=info msg="TearDown network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" successfully" Feb 13 15:57:42.521926 containerd[1468]: time="2025-02-13T15:57:42.521891607Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" returns successfully" Feb 13 15:57:42.522020 containerd[1468]: time="2025-02-13T15:57:42.521879563Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" Feb 13 15:57:42.522101 containerd[1468]: time="2025-02-13T15:57:42.522076474Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully" Feb 13 15:57:42.522194 containerd[1468]: time="2025-02-13T15:57:42.522099867Z" level=info msg="StopPodSandbox for 
\"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully" Feb 13 15:57:42.523419 systemd[1]: run-netns-cni\x2de870f17f\x2d09e1\x2db612\x2d9930\x2dd4a2297f804d.mount: Deactivated successfully. Feb 13 15:57:42.525059 containerd[1468]: time="2025-02-13T15:57:42.524942446Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:42.525207 containerd[1468]: time="2025-02-13T15:57:42.525069515Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\"" Feb 13 15:57:42.525258 containerd[1468]: time="2025-02-13T15:57:42.525216458Z" level=info msg="TearDown network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully" Feb 13 15:57:42.525258 containerd[1468]: time="2025-02-13T15:57:42.525233601Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully" Feb 13 15:57:42.526970 containerd[1468]: time="2025-02-13T15:57:42.525702935Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\"" Feb 13 15:57:42.526970 containerd[1468]: time="2025-02-13T15:57:42.525819274Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully" Feb 13 15:57:42.526970 containerd[1468]: time="2025-02-13T15:57:42.525835199Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully" Feb 13 15:57:42.526970 containerd[1468]: time="2025-02-13T15:57:42.526821069Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\"" Feb 13 15:57:42.526970 containerd[1468]: time="2025-02-13T15:57:42.526939942Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully" Feb 13 
15:57:42.526970 containerd[1468]: time="2025-02-13T15:57:42.526964411Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully" Feb 13 15:57:42.528286 containerd[1468]: time="2025-02-13T15:57:42.528096552Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:42.528286 containerd[1468]: time="2025-02-13T15:57:42.528184366Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully" Feb 13 15:57:42.528964 containerd[1468]: time="2025-02-13T15:57:42.528446571Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\"" Feb 13 15:57:42.528964 containerd[1468]: time="2025-02-13T15:57:42.528556084Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully" Feb 13 15:57:42.528964 containerd[1468]: time="2025-02-13T15:57:42.528572378Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully" Feb 13 15:57:42.531716 containerd[1468]: time="2025-02-13T15:57:42.531658299Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:42.532455 containerd[1468]: time="2025-02-13T15:57:42.531743685Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" Feb 13 15:57:42.532455 containerd[1468]: time="2025-02-13T15:57:42.532330542Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:42.532455 containerd[1468]: time="2025-02-13T15:57:42.532356743Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 
15:57:42.532455 containerd[1468]: time="2025-02-13T15:57:42.532371174Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully" Feb 13 15:57:42.532455 containerd[1468]: time="2025-02-13T15:57:42.532383115Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully" Feb 13 15:57:42.533860 containerd[1468]: time="2025-02-13T15:57:42.533648582Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:42.534406 containerd[1468]: time="2025-02-13T15:57:42.534177555Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:6,}" Feb 13 15:57:42.535095 containerd[1468]: time="2025-02-13T15:57:42.535063512Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:42.536014 containerd[1468]: time="2025-02-13T15:57:42.535974131Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:42.537776 containerd[1468]: time="2025-02-13T15:57:42.537645102Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:42.537865 containerd[1468]: time="2025-02-13T15:57:42.537796443Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:42.537865 containerd[1468]: time="2025-02-13T15:57:42.537812539Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:42.541037 containerd[1468]: time="2025-02-13T15:57:42.540712108Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:10,}" Feb 13 15:57:42.792327 containerd[1468]: time="2025-02-13T15:57:42.790882733Z" level=error msg="Failed to destroy network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.793603 containerd[1468]: time="2025-02-13T15:57:42.793552324Z" level=error msg="encountered an error cleaning up failed sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.793829 containerd[1468]: time="2025-02-13T15:57:42.793800911Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:10,} failed, error" error="failed to setup network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.794764 kubelet[1782]: E0213 15:57:42.794714 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.795879 kubelet[1782]: 
E0213 15:57:42.795053 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:42.795879 kubelet[1782]: E0213 15:57:42.795094 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:42.795879 kubelet[1782]: E0213 15:57:42.795314 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:42.853969 containerd[1468]: time="2025-02-13T15:57:42.853735430Z" level=error msg="Failed to destroy network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.854448 containerd[1468]: time="2025-02-13T15:57:42.854255170Z" level=error msg="encountered an error cleaning up failed sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.854448 containerd[1468]: time="2025-02-13T15:57:42.854353207Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:6,} failed, error" error="failed to setup network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.855820 kubelet[1782]: E0213 15:57:42.854639 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:42.855820 kubelet[1782]: E0213 15:57:42.854728 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:42.855820 kubelet[1782]: E0213 15:57:42.854756 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:42.855996 kubelet[1782]: E0213 15:57:42.854847 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d" Feb 13 15:57:43.004575 kubelet[1782]: E0213 15:57:43.004299 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:43.336695 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3-shm.mount: Deactivated successfully. Feb 13 15:57:43.337630 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75-shm.mount: Deactivated successfully. 
Feb 13 15:57:43.526267 kubelet[1782]: I0213 15:57:43.525197 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75" Feb 13 15:57:43.526840 containerd[1468]: time="2025-02-13T15:57:43.526595088Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\"" Feb 13 15:57:43.527367 containerd[1468]: time="2025-02-13T15:57:43.526854842Z" level=info msg="Ensure that sandbox 694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75 in task-service has been cleanup successfully" Feb 13 15:57:43.530421 containerd[1468]: time="2025-02-13T15:57:43.528218834Z" level=info msg="TearDown network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" successfully" Feb 13 15:57:43.530421 containerd[1468]: time="2025-02-13T15:57:43.528262528Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" returns successfully" Feb 13 15:57:43.530632 containerd[1468]: time="2025-02-13T15:57:43.530556561Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\"" Feb 13 15:57:43.531181 containerd[1468]: time="2025-02-13T15:57:43.530703041Z" level=info msg="TearDown network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" successfully" Feb 13 15:57:43.531181 containerd[1468]: time="2025-02-13T15:57:43.530732583Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" returns successfully" Feb 13 15:57:43.532293 containerd[1468]: time="2025-02-13T15:57:43.531513198Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\"" Feb 13 15:57:43.532293 containerd[1468]: time="2025-02-13T15:57:43.531630408Z" level=info msg="TearDown network for sandbox 
\"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully" Feb 13 15:57:43.532293 containerd[1468]: time="2025-02-13T15:57:43.531645999Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully" Feb 13 15:57:43.532007 systemd[1]: run-netns-cni\x2da05aaaee\x2dcf89\x2d6d47\x2d5db2\x2db67e9f1b8f17.mount: Deactivated successfully. Feb 13 15:57:43.536150 containerd[1468]: time="2025-02-13T15:57:43.534921258Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\"" Feb 13 15:57:43.536150 containerd[1468]: time="2025-02-13T15:57:43.535065903Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully" Feb 13 15:57:43.536150 containerd[1468]: time="2025-02-13T15:57:43.535083134Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully" Feb 13 15:57:43.536150 containerd[1468]: time="2025-02-13T15:57:43.535751401Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\"" Feb 13 15:57:43.536150 containerd[1468]: time="2025-02-13T15:57:43.535871470Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully" Feb 13 15:57:43.536150 containerd[1468]: time="2025-02-13T15:57:43.535888310Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully" Feb 13 15:57:43.536807 containerd[1468]: time="2025-02-13T15:57:43.536497467Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\"" Feb 13 15:57:43.536807 containerd[1468]: time="2025-02-13T15:57:43.536602517Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" 
successfully" Feb 13 15:57:43.536807 containerd[1468]: time="2025-02-13T15:57:43.536619247Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully" Feb 13 15:57:43.537363 containerd[1468]: time="2025-02-13T15:57:43.537319492Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" Feb 13 15:57:43.537443 containerd[1468]: time="2025-02-13T15:57:43.537426568Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully" Feb 13 15:57:43.537472 containerd[1468]: time="2025-02-13T15:57:43.537445374Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully" Feb 13 15:57:43.539189 containerd[1468]: time="2025-02-13T15:57:43.538175568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:7,}" Feb 13 15:57:43.550933 kubelet[1782]: I0213 15:57:43.550745 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3" Feb 13 15:57:43.551893 containerd[1468]: time="2025-02-13T15:57:43.551850434Z" level=info msg="StopPodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\"" Feb 13 15:57:43.553328 containerd[1468]: time="2025-02-13T15:57:43.553275788Z" level=info msg="Ensure that sandbox 9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3 in task-service has been cleanup successfully" Feb 13 15:57:43.558465 systemd[1]: run-netns-cni\x2dc2c9d4bf\x2dba3a\x2ddb24\x2dd2a0\x2df5f601b8c04f.mount: Deactivated successfully. 
Feb 13 15:57:43.559936 containerd[1468]: time="2025-02-13T15:57:43.559600369Z" level=info msg="TearDown network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" successfully" Feb 13 15:57:43.559936 containerd[1468]: time="2025-02-13T15:57:43.559647084Z" level=info msg="StopPodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" returns successfully" Feb 13 15:57:43.560994 containerd[1468]: time="2025-02-13T15:57:43.560933105Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\"" Feb 13 15:57:43.561408 containerd[1468]: time="2025-02-13T15:57:43.561107711Z" level=info msg="TearDown network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" successfully" Feb 13 15:57:43.561502 containerd[1468]: time="2025-02-13T15:57:43.561410367Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" returns successfully" Feb 13 15:57:43.562092 containerd[1468]: time="2025-02-13T15:57:43.562064430Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\"" Feb 13 15:57:43.562399 containerd[1468]: time="2025-02-13T15:57:43.562370497Z" level=info msg="TearDown network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" successfully" Feb 13 15:57:43.562476 containerd[1468]: time="2025-02-13T15:57:43.562464533Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" returns successfully" Feb 13 15:57:43.563609 containerd[1468]: time="2025-02-13T15:57:43.563567930Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\"" Feb 13 15:57:43.564163 containerd[1468]: time="2025-02-13T15:57:43.564110877Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully" Feb 
13 15:57:43.565842 containerd[1468]: time="2025-02-13T15:57:43.564309191Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully" Feb 13 15:57:43.565842 containerd[1468]: time="2025-02-13T15:57:43.565265134Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\"" Feb 13 15:57:43.565842 containerd[1468]: time="2025-02-13T15:57:43.565424033Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully" Feb 13 15:57:43.565842 containerd[1468]: time="2025-02-13T15:57:43.565466753Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully" Feb 13 15:57:43.568158 containerd[1468]: time="2025-02-13T15:57:43.566450163Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\"" Feb 13 15:57:43.568443 containerd[1468]: time="2025-02-13T15:57:43.568417626Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully" Feb 13 15:57:43.568513 containerd[1468]: time="2025-02-13T15:57:43.568502425Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully" Feb 13 15:57:43.570079 containerd[1468]: time="2025-02-13T15:57:43.570013258Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" Feb 13 15:57:43.570411 containerd[1468]: time="2025-02-13T15:57:43.570376171Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully" Feb 13 15:57:43.570453 containerd[1468]: time="2025-02-13T15:57:43.570414400Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully" Feb 13 
15:57:43.571023 containerd[1468]: time="2025-02-13T15:57:43.570983054Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:43.571167 containerd[1468]: time="2025-02-13T15:57:43.571143296Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:43.571247 containerd[1468]: time="2025-02-13T15:57:43.571167127Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully" Feb 13 15:57:43.571767 containerd[1468]: time="2025-02-13T15:57:43.571573051Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:43.571767 containerd[1468]: time="2025-02-13T15:57:43.571680752Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:43.571957 containerd[1468]: time="2025-02-13T15:57:43.571855654Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:43.572417 containerd[1468]: time="2025-02-13T15:57:43.572380835Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:43.577393 containerd[1468]: time="2025-02-13T15:57:43.577311162Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:43.577393 containerd[1468]: time="2025-02-13T15:57:43.577364854Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:43.580315 containerd[1468]: time="2025-02-13T15:57:43.580219648Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:43.580483 
containerd[1468]: time="2025-02-13T15:57:43.580442228Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:43.589567 containerd[1468]: time="2025-02-13T15:57:43.580462922Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:43.601375 containerd[1468]: time="2025-02-13T15:57:43.601324426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:11,}" Feb 13 15:57:43.785775 containerd[1468]: time="2025-02-13T15:57:43.785697219Z" level=error msg="Failed to destroy network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.786544 containerd[1468]: time="2025-02-13T15:57:43.786490382Z" level=error msg="encountered an error cleaning up failed sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.786760 containerd[1468]: time="2025-02-13T15:57:43.786727746Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 13 15:57:43.787342 kubelet[1782]: E0213 15:57:43.787280 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.787512 kubelet[1782]: E0213 15:57:43.787387 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:43.787512 kubelet[1782]: E0213 15:57:43.787428 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq" Feb 13 15:57:43.787854 kubelet[1782]: E0213 15:57:43.787495 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\\\": 
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d" Feb 13 15:57:43.794651 containerd[1468]: time="2025-02-13T15:57:43.793800768Z" level=error msg="Failed to destroy network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.794651 containerd[1468]: time="2025-02-13T15:57:43.794326134Z" level=error msg="encountered an error cleaning up failed sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.794651 containerd[1468]: time="2025-02-13T15:57:43.794432707Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:11,} failed, error" error="failed to setup network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.794938 kubelet[1782]: E0213 15:57:43.794808 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:57:43.794938 kubelet[1782]: E0213 15:57:43.794895 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:43.794938 kubelet[1782]: E0213 15:57:43.794925 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h6pm6" Feb 13 15:57:43.795091 kubelet[1782]: E0213 15:57:43.794990 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h6pm6_calico-system(120c5850-254f-4135-9f32-a6b8fe647e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h6pm6" podUID="120c5850-254f-4135-9f32-a6b8fe647e54" Feb 13 15:57:43.847025 kubelet[1782]: I0213 15:57:43.845959 1782 topology_manager.go:215] "Topology Admit Handler" 
podUID="f2062c4f-1abe-4f44-bd1f-3ab27324a022" podNamespace="calico-system" podName="calico-typha-8c7cb8495-7jdbw" Feb 13 15:57:43.862618 systemd[1]: Created slice kubepods-besteffort-podf2062c4f_1abe_4f44_bd1f_3ab27324a022.slice - libcontainer container kubepods-besteffort-podf2062c4f_1abe_4f44_bd1f_3ab27324a022.slice. Feb 13 15:57:43.929537 kubelet[1782]: I0213 15:57:43.927014 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljmf\" (UniqueName: \"kubernetes.io/projected/f2062c4f-1abe-4f44-bd1f-3ab27324a022-kube-api-access-mljmf\") pod \"calico-typha-8c7cb8495-7jdbw\" (UID: \"f2062c4f-1abe-4f44-bd1f-3ab27324a022\") " pod="calico-system/calico-typha-8c7cb8495-7jdbw" Feb 13 15:57:43.929537 kubelet[1782]: I0213 15:57:43.927240 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f2062c4f-1abe-4f44-bd1f-3ab27324a022-typha-certs\") pod \"calico-typha-8c7cb8495-7jdbw\" (UID: \"f2062c4f-1abe-4f44-bd1f-3ab27324a022\") " pod="calico-system/calico-typha-8c7cb8495-7jdbw" Feb 13 15:57:43.929537 kubelet[1782]: I0213 15:57:43.927280 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2062c4f-1abe-4f44-bd1f-3ab27324a022-tigera-ca-bundle\") pod \"calico-typha-8c7cb8495-7jdbw\" (UID: \"f2062c4f-1abe-4f44-bd1f-3ab27324a022\") " pod="calico-system/calico-typha-8c7cb8495-7jdbw" Feb 13 15:57:44.004760 kubelet[1782]: E0213 15:57:44.004686 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:44.128475 containerd[1468]: time="2025-02-13T15:57:44.127986000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:44.129864 containerd[1468]: 
time="2025-02-13T15:57:44.129627520Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 15:57:44.134501 containerd[1468]: time="2025-02-13T15:57:44.134375699Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:44.144763 containerd[1468]: time="2025-02-13T15:57:44.144309036Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:44.144763 containerd[1468]: time="2025-02-13T15:57:44.144594131Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 11.929955927s" Feb 13 15:57:44.144763 containerd[1468]: time="2025-02-13T15:57:44.144630340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 15:57:44.157570 containerd[1468]: time="2025-02-13T15:57:44.157473964Z" level=info msg="CreateContainer within sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 15:57:44.167961 kubelet[1782]: E0213 15:57:44.167903 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:44.168974 containerd[1468]: time="2025-02-13T15:57:44.168911851Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-typha-8c7cb8495-7jdbw,Uid:f2062c4f-1abe-4f44-bd1f-3ab27324a022,Namespace:calico-system,Attempt:0,}" Feb 13 15:57:44.191687 containerd[1468]: time="2025-02-13T15:57:44.191345601Z" level=info msg="CreateContainer within sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\"" Feb 13 15:57:44.194489 containerd[1468]: time="2025-02-13T15:57:44.192811601Z" level=info msg="StartContainer for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\"" Feb 13 15:57:44.285487 containerd[1468]: time="2025-02-13T15:57:44.285098786Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:44.286034 containerd[1468]: time="2025-02-13T15:57:44.285697441Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:44.286034 containerd[1468]: time="2025-02-13T15:57:44.285788016Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:44.286034 containerd[1468]: time="2025-02-13T15:57:44.285943454Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:44.344990 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6-shm.mount: Deactivated successfully. Feb 13 15:57:44.345182 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1704328086.mount: Deactivated successfully. 
Feb 13 15:57:44.407486 systemd[1]: Started cri-containerd-e93679725cd535357c73d3ab5453e2d80788af5aee96182c1232449bce842f0d.scope - libcontainer container e93679725cd535357c73d3ab5453e2d80788af5aee96182c1232449bce842f0d. Feb 13 15:57:44.412506 systemd[1]: Started cri-containerd-5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc.scope - libcontainer container 5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc. Feb 13 15:57:44.500506 containerd[1468]: time="2025-02-13T15:57:44.500442487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8c7cb8495-7jdbw,Uid:f2062c4f-1abe-4f44-bd1f-3ab27324a022,Namespace:calico-system,Attempt:0,} returns sandbox id \"e93679725cd535357c73d3ab5453e2d80788af5aee96182c1232449bce842f0d\"" Feb 13 15:57:44.501911 kubelet[1782]: E0213 15:57:44.501850 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:44.503884 containerd[1468]: time="2025-02-13T15:57:44.503766402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 15:57:44.526152 containerd[1468]: time="2025-02-13T15:57:44.525571181Z" level=info msg="StartContainer for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" returns successfully" Feb 13 15:57:44.565528 kubelet[1782]: I0213 15:57:44.565352 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6" Feb 13 15:57:44.567962 containerd[1468]: time="2025-02-13T15:57:44.567409428Z" level=info msg="StopPodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\"" Feb 13 15:57:44.567962 containerd[1468]: time="2025-02-13T15:57:44.567726392Z" level=info msg="Ensure that sandbox 467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6 in task-service has been cleanup successfully" 
Feb 13 15:57:44.573914 systemd[1]: run-netns-cni\x2dce6b8fef\x2dd884\x2d686c\x2d3d98\x2d29b572f9ef44.mount: Deactivated successfully. Feb 13 15:57:44.577495 containerd[1468]: time="2025-02-13T15:57:44.574251605Z" level=info msg="TearDown network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" successfully" Feb 13 15:57:44.577495 containerd[1468]: time="2025-02-13T15:57:44.574324502Z" level=info msg="StopPodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" returns successfully" Feb 13 15:57:44.579818 containerd[1468]: time="2025-02-13T15:57:44.579231604Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\"" Feb 13 15:57:44.582359 containerd[1468]: time="2025-02-13T15:57:44.582295850Z" level=info msg="TearDown network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" successfully" Feb 13 15:57:44.582359 containerd[1468]: time="2025-02-13T15:57:44.582346365Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" returns successfully" Feb 13 15:57:44.583474 containerd[1468]: time="2025-02-13T15:57:44.583206815Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\"" Feb 13 15:57:44.583474 containerd[1468]: time="2025-02-13T15:57:44.583346040Z" level=info msg="TearDown network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" successfully" Feb 13 15:57:44.583474 containerd[1468]: time="2025-02-13T15:57:44.583364308Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" returns successfully" Feb 13 15:57:44.585338 containerd[1468]: time="2025-02-13T15:57:44.585292193Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\"" Feb 13 15:57:44.585808 containerd[1468]: 
time="2025-02-13T15:57:44.585476260Z" level=info msg="TearDown network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully" Feb 13 15:57:44.585808 containerd[1468]: time="2025-02-13T15:57:44.585497029Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully" Feb 13 15:57:44.587107 containerd[1468]: time="2025-02-13T15:57:44.586890218Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\"" Feb 13 15:57:44.587107 containerd[1468]: time="2025-02-13T15:57:44.587036006Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully" Feb 13 15:57:44.587107 containerd[1468]: time="2025-02-13T15:57:44.587051866Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully" Feb 13 15:57:44.588164 containerd[1468]: time="2025-02-13T15:57:44.588104193Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\"" Feb 13 15:57:44.588550 containerd[1468]: time="2025-02-13T15:57:44.588428790Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully" Feb 13 15:57:44.588550 containerd[1468]: time="2025-02-13T15:57:44.588471144Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully" Feb 13 15:57:44.590096 containerd[1468]: time="2025-02-13T15:57:44.589919020Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\"" Feb 13 15:57:44.590096 containerd[1468]: time="2025-02-13T15:57:44.590030750Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully" Feb 13 15:57:44.590096 containerd[1468]: 
time="2025-02-13T15:57:44.590042033Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully" Feb 13 15:57:44.591930 containerd[1468]: time="2025-02-13T15:57:44.591698896Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\"" Feb 13 15:57:44.591930 containerd[1468]: time="2025-02-13T15:57:44.591881902Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully" Feb 13 15:57:44.592932 containerd[1468]: time="2025-02-13T15:57:44.592425821Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully" Feb 13 15:57:44.600276 containerd[1468]: time="2025-02-13T15:57:44.599848006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:8,}" Feb 13 15:57:44.637036 kubelet[1782]: I0213 15:57:44.636668 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-d69xk" podStartSLOduration=4.055294103 podStartE2EDuration="29.636640012s" podCreationTimestamp="2025-02-13 15:57:15 +0000 UTC" firstStartedPulling="2025-02-13 15:57:18.565056391 +0000 UTC m=+4.300387165" lastFinishedPulling="2025-02-13 15:57:44.146402288 +0000 UTC m=+29.881733074" observedRunningTime="2025-02-13 15:57:44.633000641 +0000 UTC m=+30.368331530" watchObservedRunningTime="2025-02-13 15:57:44.636640012 +0000 UTC m=+30.371970806" Feb 13 15:57:44.652976 kubelet[1782]: I0213 15:57:44.652808 1782 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762" Feb 13 15:57:44.654439 containerd[1468]: time="2025-02-13T15:57:44.653764611Z" level=info msg="StopPodSandbox for 
\"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\"" Feb 13 15:57:44.654439 containerd[1468]: time="2025-02-13T15:57:44.654093885Z" level=info msg="Ensure that sandbox 2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762 in task-service has been cleanup successfully" Feb 13 15:57:44.664191 systemd[1]: run-netns-cni\x2d29af6d0d\x2d79a4\x2de851\x2dd817\x2dd283a988f357.mount: Deactivated successfully. Feb 13 15:57:44.668741 containerd[1468]: time="2025-02-13T15:57:44.668107626Z" level=info msg="TearDown network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\" successfully" Feb 13 15:57:44.669370 containerd[1468]: time="2025-02-13T15:57:44.668672644Z" level=info msg="StopPodSandbox for \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\" returns successfully" Feb 13 15:57:44.670617 containerd[1468]: time="2025-02-13T15:57:44.670586356Z" level=info msg="StopPodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\"" Feb 13 15:57:44.670979 containerd[1468]: time="2025-02-13T15:57:44.670920375Z" level=info msg="TearDown network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" successfully" Feb 13 15:57:44.671134 containerd[1468]: time="2025-02-13T15:57:44.671053798Z" level=info msg="StopPodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" returns successfully" Feb 13 15:57:44.675397 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 15:57:44.677829 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Feb 13 15:57:44.678318 containerd[1468]: time="2025-02-13T15:57:44.678259436Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\"" Feb 13 15:57:44.678809 containerd[1468]: time="2025-02-13T15:57:44.678620158Z" level=info msg="TearDown network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" successfully" Feb 13 15:57:44.678809 containerd[1468]: time="2025-02-13T15:57:44.678740897Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" returns successfully" Feb 13 15:57:44.680061 containerd[1468]: time="2025-02-13T15:57:44.680021534Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\"" Feb 13 15:57:44.680387 containerd[1468]: time="2025-02-13T15:57:44.680299756Z" level=info msg="TearDown network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" successfully" Feb 13 15:57:44.680387 containerd[1468]: time="2025-02-13T15:57:44.680323482Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" returns successfully" Feb 13 15:57:44.683031 containerd[1468]: time="2025-02-13T15:57:44.681898546Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\"" Feb 13 15:57:44.686163 containerd[1468]: time="2025-02-13T15:57:44.685312528Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully" Feb 13 15:57:44.686163 containerd[1468]: time="2025-02-13T15:57:44.685357235Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully" Feb 13 15:57:44.689203 containerd[1468]: time="2025-02-13T15:57:44.689141086Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\"" Feb 13 15:57:44.689640 
containerd[1468]: time="2025-02-13T15:57:44.689585375Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully" Feb 13 15:57:44.689825 containerd[1468]: time="2025-02-13T15:57:44.689725545Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully" Feb 13 15:57:44.690839 containerd[1468]: time="2025-02-13T15:57:44.690779974Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\"" Feb 13 15:57:44.695548 containerd[1468]: time="2025-02-13T15:57:44.691921007Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully" Feb 13 15:57:44.695548 containerd[1468]: time="2025-02-13T15:57:44.691953071Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully" Feb 13 15:57:44.707949 containerd[1468]: time="2025-02-13T15:57:44.707892761Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\"" Feb 13 15:57:44.708474 containerd[1468]: time="2025-02-13T15:57:44.708423018Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully" Feb 13 15:57:44.708619 containerd[1468]: time="2025-02-13T15:57:44.708602087Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully" Feb 13 15:57:44.709497 containerd[1468]: time="2025-02-13T15:57:44.709455263Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\"" Feb 13 15:57:44.709930 containerd[1468]: time="2025-02-13T15:57:44.709903803Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully" Feb 13 15:57:44.711002 
containerd[1468]: time="2025-02-13T15:57:44.710037491Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully" Feb 13 15:57:44.712006 containerd[1468]: time="2025-02-13T15:57:44.711687599Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:57:44.712910 containerd[1468]: time="2025-02-13T15:57:44.712511585Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:57:44.712910 containerd[1468]: time="2025-02-13T15:57:44.712852723Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:57:44.716458 containerd[1468]: time="2025-02-13T15:57:44.716178275Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:57:44.716458 containerd[1468]: time="2025-02-13T15:57:44.716328118Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:57:44.716458 containerd[1468]: time="2025-02-13T15:57:44.716345157Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:57:44.717516 containerd[1468]: time="2025-02-13T15:57:44.717261010Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:57:44.717516 containerd[1468]: time="2025-02-13T15:57:44.717391841Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:57:44.717516 containerd[1468]: time="2025-02-13T15:57:44.717408368Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:57:44.720074 
containerd[1468]: time="2025-02-13T15:57:44.719858330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:12,}"
Feb 13 15:57:45.006162 kubelet[1782]: E0213 15:57:45.005810 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:57:45.036420 containerd[1468]: time="2025-02-13T15:57:45.035447323Z" level=info msg="StopContainer for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" with timeout 5 (s)"
Feb 13 15:57:45.038142 containerd[1468]: time="2025-02-13T15:57:45.038088513Z" level=info msg="Stop container \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" with signal terminated"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.018 [INFO][3081] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.018 [INFO][3081] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" iface="eth0" netns="/var/run/netns/cni-c57b86f2-854a-1c1a-0e58-99d54a77c31f"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.019 [INFO][3081] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" iface="eth0" netns="/var/run/netns/cni-c57b86f2-854a-1c1a-0e58-99d54a77c31f"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.020 [INFO][3081] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" iface="eth0" netns="/var/run/netns/cni-c57b86f2-854a-1c1a-0e58-99d54a77c31f"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.020 [INFO][3081] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.020 [INFO][3081] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.093 [INFO][3107] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" HandleID="k8s-pod-network.b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" Workload="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.094 [INFO][3107] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.094 [INFO][3107] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.131 [WARNING][3107] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" HandleID="k8s-pod-network.b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" Workload="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.131 [INFO][3107] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" HandleID="k8s-pod-network.b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c" Workload="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0"
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.139 [INFO][3107] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 15:57:45.151089 containerd[1468]: 2025-02-13 15:57:45.144 [INFO][3081] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c"
Feb 13 15:57:45.162746 containerd[1468]: time="2025-02-13T15:57:45.161137373Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:8,} failed, error" error="failed to setup network for sandbox \"b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:45.163002 kubelet[1782]: E0213 15:57:45.161577 1782 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:57:45.163002 kubelet[1782]: E0213 15:57:45.161636 1782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:45.163002 kubelet[1782]: E0213 15:57:45.161706 1782 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-85f456d6dd-b2htq"
Feb 13 15:57:45.163103 kubelet[1782]: E0213 15:57:45.161791 1782 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-85f456d6dd-b2htq_default(f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b20c0e978e30d4f9dc8d7b7b72fb86836a4d316da9482e5f0d14505548b0995c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-85f456d6dd-b2htq" podUID="f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d"
Feb 13 15:57:45.528229 systemd-networkd[1360]: calicebde52d2f8: Link UP
Feb 13 15:57:45.529844 systemd-networkd[1360]: calicebde52d2f8: Gained carrier
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:44.967 [INFO][3086] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.029 [INFO][3086] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {143.198.68.221-k8s-csi--node--driver--h6pm6-eth0 csi-node-driver- calico-system 120c5850-254f-4135-9f32-a6b8fe647e54 1041 0 2025-02-13 15:57:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 143.198.68.221 csi-node-driver-h6pm6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicebde52d2f8 [] []}} ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.029 [INFO][3086] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.150 [INFO][3121] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" HandleID="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Workload="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.193 [INFO][3121] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" HandleID="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Workload="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000501c0), Attrs:map[string]string{"namespace":"calico-system", "node":"143.198.68.221", "pod":"csi-node-driver-h6pm6", "timestamp":"2025-02-13 15:57:45.15041227 +0000 UTC"}, Hostname:"143.198.68.221", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.193 [INFO][3121] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.194 [INFO][3121] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.194 [INFO][3121] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '143.198.68.221'
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.205 [INFO][3121] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.221 [INFO][3121] ipam/ipam.go 372: Looking up existing affinities for host host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.249 [INFO][3121] ipam/ipam.go 521: Ran out of existing affine blocks for host host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.260 [INFO][3121] ipam/ipam.go 538: Tried all affine blocks. Looking for an affine block with space, or a new unclaimed block host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.276 [INFO][3121] ipam/ipam_block_reader_writer.go 154: Found free block: 192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.276 [INFO][3121] ipam/ipam.go 550: Found unclaimed block host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.276 [INFO][3121] ipam/ipam_block_reader_writer.go 171: Trying to create affinity in pending state host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.287 [INFO][3121] ipam/ipam_block_reader_writer.go 201: Successfully created pending affinity for block host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.287 [INFO][3121] ipam/ipam.go 155: Attempting to load block cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.294 [INFO][3121] ipam/ipam.go 160: The referenced block doesn't exist, trying to create it cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.300 [INFO][3121] ipam/ipam.go 167: Wrote affinity as pending cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.305 [INFO][3121] ipam/ipam.go 176: Attempting to claim the block cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.305 [INFO][3121] ipam/ipam_block_reader_writer.go 223: Attempting to create a new block host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.322 [INFO][3121] ipam/ipam_block_reader_writer.go 264: Successfully created block
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.322 [INFO][3121] ipam/ipam_block_reader_writer.go 275: Confirming affinity host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.349 [INFO][3121] ipam/ipam_block_reader_writer.go 290: Successfully confirmed affinity host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.349 [INFO][3121] ipam/ipam.go 585: Block '192.168.3.64/26' has 64 free ips which is more than 1 ips required. host="143.198.68.221" subnet=192.168.3.64/26
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.349 [INFO][3121] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.355 [INFO][3121] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f
Feb 13 15:57:45.561907 containerd[1468]: 2025-02-13 15:57:45.363 [INFO][3121] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567046 containerd[1468]: 2025-02-13 15:57:45.369 [ERROR][3121] ipam/customresource.go 184: Error updating resource Key=IPAMBlock(192-168-3-64-26) Name="192-168-3-64-26" Resource="IPAMBlocks" Value=&v3.IPAMBlock{TypeMeta:v1.TypeMeta{Kind:"IPAMBlock", APIVersion:"crd.projectcalico.org/v1"}, ObjectMeta:v1.ObjectMeta{Name:"192-168-3-64-26", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"1352", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.IPAMBlockSpec{CIDR:"192.168.3.64/26", Affinity:(*string)(0xc000319010), Allocations:[]*int{(*int)(0xc0001037f8), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil)}, Unallocated:[]int{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63}, Attributes:[]v3.AllocationAttribute{v3.AllocationAttribute{AttrPrimary:(*string)(0xc0000501c0), AttrSecondary:map[string]string{"namespace":"calico-system", "node":"143.198.68.221", "pod":"csi-node-driver-h6pm6", "timestamp":"2025-02-13 15:57:45.15041227 +0000 UTC"}}}, SequenceNumber:0x1823cfb6d7bddd65, SequenceNumberForAllocation:map[string]uint64{"0":0x1823cfb6d7bddd64}, Deleted:false, DeprecatedStrictAffinity:false}} error=Operation cannot be fulfilled on ipamblocks.crd.projectcalico.org "192-168-3-64-26": the object has been modified; please apply your changes to the latest version and try again
Feb 13 15:57:45.567046 containerd[1468]: 2025-02-13 15:57:45.370 [INFO][3121] ipam/ipam.go 1207: Failed to update block block=192.168.3.64/26 error=update conflict: IPAMBlock(192-168-3-64-26) handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567046 containerd[1468]: 2025-02-13 15:57:45.407 [INFO][3121] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567046 containerd[1468]: 2025-02-13 15:57:45.410 [INFO][3121] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f
Feb 13 15:57:45.567046 containerd[1468]: 2025-02-13 15:57:45.418 [INFO][3121] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.425 [ERROR][3121] ipam/customresource.go 184: Error updating resource Key=IPAMBlock(192-168-3-64-26) Name="192-168-3-64-26" Resource="IPAMBlocks" Value=&v3.IPAMBlock{TypeMeta:v1.TypeMeta{Kind:"IPAMBlock", APIVersion:"crd.projectcalico.org/v1"}, ObjectMeta:v1.ObjectMeta{Name:"192-168-3-64-26", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"1354", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.IPAMBlockSpec{CIDR:"192.168.3.64/26", Affinity:(*string)(0xc000480290), Allocations:[]*int{(*int)(0xc0003a6f98), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil), (*int)(nil)}, Unallocated:[]int{1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63}, Attributes:[]v3.AllocationAttribute{v3.AllocationAttribute{AttrPrimary:(*string)(0xc0000501c0), AttrSecondary:map[string]string{"namespace":"calico-system", "node":"143.198.68.221", "pod":"csi-node-driver-h6pm6", "timestamp":"2025-02-13 15:57:45.15041227 +0000 UTC"}}}, SequenceNumber:0x1823cfb6d7bddd66, SequenceNumberForAllocation:map[string]uint64{"0":0x1823cfb6d7bddd65}, Deleted:false, DeprecatedStrictAffinity:false}} error=Operation cannot be fulfilled on ipamblocks.crd.projectcalico.org "192-168-3-64-26": the object has been modified; please apply your changes to the latest version and try again
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.426 [INFO][3121] ipam/ipam.go 1207: Failed to update block block=192.168.3.64/26 error=update conflict: IPAMBlock(192-168-3-64-26) handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.473 [INFO][3121] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.482 [INFO][3121] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.494 [INFO][3121] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.505 [INFO][3121] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.3.65/26] block=192.168.3.64/26 handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.505 [INFO][3121] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.3.65/26] handle="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" host="143.198.68.221"
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.505 [INFO][3121] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 15:57:45.567809 containerd[1468]: 2025-02-13 15:57:45.505 [INFO][3121] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.65/26] IPv6=[] ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" HandleID="k8s-pod-network.eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Workload="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.568677 containerd[1468]: 2025-02-13 15:57:45.508 [INFO][3086] cni-plugin/k8s.go 386: Populated endpoint ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-csi--node--driver--h6pm6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"120c5850-254f-4135-9f32-a6b8fe647e54", ResourceVersion:"1041", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"", Pod:"csi-node-driver-h6pm6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.3.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicebde52d2f8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:57:45.568677 containerd[1468]: 2025-02-13 15:57:45.508 [INFO][3086] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.3.65/32] ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.568677 containerd[1468]: 2025-02-13 15:57:45.508 [INFO][3086] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicebde52d2f8 ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.568677 containerd[1468]: 2025-02-13 15:57:45.529 [INFO][3086] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.568677 containerd[1468]: 2025-02-13 15:57:45.533 [INFO][3086] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-csi--node--driver--h6pm6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"120c5850-254f-4135-9f32-a6b8fe647e54", ResourceVersion:"1041", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f", Pod:"csi-node-driver-h6pm6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.3.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicebde52d2f8", MAC:"ea:e4:b5:12:6e:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:57:45.568677 containerd[1468]: 2025-02-13 15:57:45.555 [INFO][3086] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f" Namespace="calico-system" Pod="csi-node-driver-h6pm6" WorkloadEndpoint="143.198.68.221-k8s-csi--node--driver--h6pm6-eth0"
Feb 13 15:57:45.600867 containerd[1468]: time="2025-02-13T15:57:45.600325275Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 15:57:45.600867 containerd[1468]: time="2025-02-13T15:57:45.600579917Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 15:57:45.600867 containerd[1468]: time="2025-02-13T15:57:45.600607248Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:57:45.600867 containerd[1468]: time="2025-02-13T15:57:45.600729850Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:57:45.631900 systemd[1]: run-containerd-runc-k8s.io-eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f-runc.qOY4Oo.mount: Deactivated successfully.
Feb 13 15:57:45.642511 systemd[1]: Started cri-containerd-eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f.scope - libcontainer container eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f.
Feb 13 15:57:45.659299 containerd[1468]: time="2025-02-13T15:57:45.658821293Z" level=info msg="StopPodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\""
Feb 13 15:57:45.659299 containerd[1468]: time="2025-02-13T15:57:45.658988684Z" level=info msg="TearDown network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" successfully"
Feb 13 15:57:45.659299 containerd[1468]: time="2025-02-13T15:57:45.659008899Z" level=info msg="StopPodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" returns successfully"
Feb 13 15:57:45.660289 containerd[1468]: time="2025-02-13T15:57:45.660074914Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\""
Feb 13 15:57:45.661344 containerd[1468]: time="2025-02-13T15:57:45.660427764Z" level=info msg="TearDown network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" successfully"
Feb 13 15:57:45.661344 containerd[1468]: time="2025-02-13T15:57:45.661199426Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" returns successfully"
Feb 13 15:57:45.662666 containerd[1468]: time="2025-02-13T15:57:45.662413703Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\""
Feb 13 15:57:45.662666 containerd[1468]: time="2025-02-13T15:57:45.662557660Z" level=info msg="TearDown network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" successfully"
Feb 13 15:57:45.662666 containerd[1468]: time="2025-02-13T15:57:45.662574695Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" returns successfully"
Feb 13 15:57:45.664779 containerd[1468]: time="2025-02-13T15:57:45.664750668Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\""
Feb 13 15:57:45.665180 containerd[1468]: time="2025-02-13T15:57:45.665113949Z" level=info msg="TearDown network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully"
Feb 13 15:57:45.665317 containerd[1468]: time="2025-02-13T15:57:45.665245913Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully"
Feb 13 15:57:45.666065 containerd[1468]: time="2025-02-13T15:57:45.665981425Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\""
Feb 13 15:57:45.666371 containerd[1468]: time="2025-02-13T15:57:45.666317633Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully"
Feb 13 15:57:45.666371 containerd[1468]: time="2025-02-13T15:57:45.666343959Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully"
Feb 13 15:57:45.667297 containerd[1468]: time="2025-02-13T15:57:45.666984166Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\""
Feb 13 15:57:45.667297 containerd[1468]: time="2025-02-13T15:57:45.667077478Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully"
Feb 13 15:57:45.667297 containerd[1468]: time="2025-02-13T15:57:45.667086614Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully"
Feb 13 15:57:45.667763 containerd[1468]: time="2025-02-13T15:57:45.667647491Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\""
Feb 13 15:57:45.668249 containerd[1468]: time="2025-02-13T15:57:45.668065308Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully"
Feb 13 15:57:45.668249 containerd[1468]: time="2025-02-13T15:57:45.668093004Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully"
Feb 13 15:57:45.669034 containerd[1468]: time="2025-02-13T15:57:45.668561102Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\""
Feb 13 15:57:45.669034 containerd[1468]: time="2025-02-13T15:57:45.668687128Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully"
Feb 13 15:57:45.669034 containerd[1468]: time="2025-02-13T15:57:45.668703449Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully"
Feb 13 15:57:45.670587 containerd[1468]: time="2025-02-13T15:57:45.670559143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:8,}"
Feb 13 15:57:45.716182 containerd[1468]: time="2025-02-13T15:57:45.715969416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h6pm6,Uid:120c5850-254f-4135-9f32-a6b8fe647e54,Namespace:calico-system,Attempt:12,} returns sandbox id \"eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f\""
Feb 13 15:57:45.904261 systemd-networkd[1360]: calic7bfb027f4a: Link UP
Feb 13 15:57:45.907750 systemd-networkd[1360]: calic7bfb027f4a: Gained carrier
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.738 [INFO][3193] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.766 [INFO][3193] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0 nginx-deployment-85f456d6dd- default f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d 1345 0 2025-02-13 15:57:35 +0000 UTC map[app:nginx pod-template-hash:85f456d6dd projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 143.198.68.221 nginx-deployment-85f456d6dd-b2htq eth0 default [] [] [kns.default ksa.default.default] calic7bfb027f4a [] []}} ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.766 [INFO][3193] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.813 [INFO][3205] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" HandleID="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Workload="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.838 [INFO][3205] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" HandleID="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Workload="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a2c30), Attrs:map[string]string{"namespace":"default", "node":"143.198.68.221", "pod":"nginx-deployment-85f456d6dd-b2htq", "timestamp":"2025-02-13 15:57:45.813268693 +0000 UTC"}, Hostname:"143.198.68.221", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.838 [INFO][3205] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.838 [INFO][3205] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.838 [INFO][3205] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '143.198.68.221'
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.843 [INFO][3205] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.852 [INFO][3205] ipam/ipam.go 372: Looking up existing affinities for host host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.860 [INFO][3205] ipam/ipam.go 489: Trying affinity for 192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.864 [INFO][3205] ipam/ipam.go 155: Attempting to load block cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.869 [INFO][3205] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.869 [INFO][3205] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.874 [INFO][3205] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.885 [INFO][3205] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" host="143.198.68.221"
Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.898 [INFO][3205] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.3.66/26] block=192.168.3.64/26
handle="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" host="143.198.68.221" Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.898 [INFO][3205] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.3.66/26] handle="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" host="143.198.68.221" Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.898 [INFO][3205] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 15:57:45.933546 containerd[1468]: 2025-02-13 15:57:45.898 [INFO][3205] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.66/26] IPv6=[] ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" HandleID="k8s-pod-network.c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Workload="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" Feb 13 15:57:45.934767 containerd[1468]: 2025-02-13 15:57:45.900 [INFO][3193] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0", GenerateName:"nginx-deployment-85f456d6dd-", Namespace:"default", SelfLink:"", UID:"f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d", ResourceVersion:"1345", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"85f456d6dd", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"", Pod:"nginx-deployment-85f456d6dd-b2htq", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.3.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"calic7bfb027f4a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:57:45.934767 containerd[1468]: 2025-02-13 15:57:45.900 [INFO][3193] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.3.66/32] ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" Feb 13 15:57:45.934767 containerd[1468]: 2025-02-13 15:57:45.900 [INFO][3193] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic7bfb027f4a ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" Feb 13 15:57:45.934767 containerd[1468]: 2025-02-13 15:57:45.904 [INFO][3193] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" Feb 13 15:57:45.934767 containerd[1468]: 2025-02-13 15:57:45.906 [INFO][3193] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" 
Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0", GenerateName:"nginx-deployment-85f456d6dd-", Namespace:"default", SelfLink:"", UID:"f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d", ResourceVersion:"1345", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"85f456d6dd", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2", Pod:"nginx-deployment-85f456d6dd-b2htq", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.3.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"calic7bfb027f4a", MAC:"1a:c1:ca:32:1c:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:57:45.934767 containerd[1468]: 2025-02-13 15:57:45.931 [INFO][3193] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2" Namespace="default" Pod="nginx-deployment-85f456d6dd-b2htq" WorkloadEndpoint="143.198.68.221-k8s-nginx--deployment--85f456d6dd--b2htq-eth0" Feb 13 15:57:45.978751 containerd[1468]: time="2025-02-13T15:57:45.978529120Z" level=info msg="loading plugin 
\"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:45.978751 containerd[1468]: time="2025-02-13T15:57:45.978623272Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:45.978751 containerd[1468]: time="2025-02-13T15:57:45.978642443Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:45.979443 containerd[1468]: time="2025-02-13T15:57:45.978770565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:46.006675 kubelet[1782]: E0213 15:57:46.006600 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:46.013514 systemd[1]: Started cri-containerd-c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2.scope - libcontainer container c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2. 
Feb 13 15:57:46.126995 containerd[1468]: time="2025-02-13T15:57:46.125707203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-85f456d6dd-b2htq,Uid:f5de6aa9-e1ad-40fa-aec5-3d5311ff3b5d,Namespace:default,Attempt:8,} returns sandbox id \"c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2\"" Feb 13 15:57:46.978534 systemd-networkd[1360]: calicebde52d2f8: Gained IPv6LL Feb 13 15:57:47.007818 kubelet[1782]: E0213 15:57:47.007720 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:47.555757 systemd-networkd[1360]: calic7bfb027f4a: Gained IPv6LL Feb 13 15:57:47.567320 containerd[1468]: time="2025-02-13T15:57:47.567041480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:47.568556 containerd[1468]: time="2025-02-13T15:57:47.568370365Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29850141" Feb 13 15:57:47.570742 containerd[1468]: time="2025-02-13T15:57:47.569588281Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:47.573285 containerd[1468]: time="2025-02-13T15:57:47.573222481Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:47.574468 containerd[1468]: time="2025-02-13T15:57:47.574417562Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 3.070566609s" Feb 13 15:57:47.574743 containerd[1468]: time="2025-02-13T15:57:47.574651785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Feb 13 15:57:47.577686 containerd[1468]: time="2025-02-13T15:57:47.577638932Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 15:57:47.602870 containerd[1468]: time="2025-02-13T15:57:47.602785480Z" level=info msg="CreateContainer within sandbox \"e93679725cd535357c73d3ab5453e2d80788af5aee96182c1232449bce842f0d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 15:57:47.643659 containerd[1468]: time="2025-02-13T15:57:47.643413633Z" level=info msg="CreateContainer within sandbox \"e93679725cd535357c73d3ab5453e2d80788af5aee96182c1232449bce842f0d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c96f570f18804d93fa429f14c4fcac7b442c592a8049784bb368096a0c3150cb\"" Feb 13 15:57:47.645219 containerd[1468]: time="2025-02-13T15:57:47.645008022Z" level=info msg="StartContainer for \"c96f570f18804d93fa429f14c4fcac7b442c592a8049784bb368096a0c3150cb\"" Feb 13 15:57:47.707745 systemd[1]: Started cri-containerd-c96f570f18804d93fa429f14c4fcac7b442c592a8049784bb368096a0c3150cb.scope - libcontainer container c96f570f18804d93fa429f14c4fcac7b442c592a8049784bb368096a0c3150cb. 
Feb 13 15:57:47.793088 containerd[1468]: time="2025-02-13T15:57:47.792991394Z" level=info msg="StartContainer for \"c96f570f18804d93fa429f14c4fcac7b442c592a8049784bb368096a0c3150cb\" returns successfully" Feb 13 15:57:48.008194 kubelet[1782]: E0213 15:57:48.008042 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:48.757387 kubelet[1782]: E0213 15:57:48.747988 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:48.790170 kubelet[1782]: I0213 15:57:48.789737 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8c7cb8495-7jdbw" podStartSLOduration=2.7156249470000002 podStartE2EDuration="5.789712705s" podCreationTimestamp="2025-02-13 15:57:43 +0000 UTC" firstStartedPulling="2025-02-13 15:57:44.503178525 +0000 UTC m=+30.238509314" lastFinishedPulling="2025-02-13 15:57:47.577266285 +0000 UTC m=+33.312597072" observedRunningTime="2025-02-13 15:57:48.789098726 +0000 UTC m=+34.524429526" watchObservedRunningTime="2025-02-13 15:57:48.789712705 +0000 UTC m=+34.525043554" Feb 13 15:57:49.009330 kubelet[1782]: E0213 15:57:49.009151 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:49.561051 containerd[1468]: time="2025-02-13T15:57:49.560971753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:49.562613 containerd[1468]: time="2025-02-13T15:57:49.562370943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 15:57:49.563869 containerd[1468]: time="2025-02-13T15:57:49.563569513Z" level=info msg="ImageCreate event 
name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:49.566593 containerd[1468]: time="2025-02-13T15:57:49.566516042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:49.567724 containerd[1468]: time="2025-02-13T15:57:49.567678737Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.989488901s" Feb 13 15:57:49.567838 containerd[1468]: time="2025-02-13T15:57:49.567729482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 15:57:49.570694 containerd[1468]: time="2025-02-13T15:57:49.570637451Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Feb 13 15:57:49.572095 containerd[1468]: time="2025-02-13T15:57:49.572012792Z" level=info msg="CreateContainer within sandbox \"eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 15:57:49.612880 containerd[1468]: time="2025-02-13T15:57:49.612565498Z" level=info msg="CreateContainer within sandbox \"eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d986ab01f60b5fa25ba0ec4a3d38ad4e0643333e8f3771888ec7ed89cd7182aa\"" Feb 13 15:57:49.613443 containerd[1468]: time="2025-02-13T15:57:49.613400377Z" level=info msg="StartContainer for 
\"d986ab01f60b5fa25ba0ec4a3d38ad4e0643333e8f3771888ec7ed89cd7182aa\"" Feb 13 15:57:49.682872 systemd[1]: Started cri-containerd-d986ab01f60b5fa25ba0ec4a3d38ad4e0643333e8f3771888ec7ed89cd7182aa.scope - libcontainer container d986ab01f60b5fa25ba0ec4a3d38ad4e0643333e8f3771888ec7ed89cd7182aa. Feb 13 15:57:49.766545 containerd[1468]: time="2025-02-13T15:57:49.766149668Z" level=info msg="StartContainer for \"d986ab01f60b5fa25ba0ec4a3d38ad4e0643333e8f3771888ec7ed89cd7182aa\" returns successfully" Feb 13 15:57:49.768461 kubelet[1782]: I0213 15:57:49.768360 1782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 15:57:49.769624 kubelet[1782]: E0213 15:57:49.769532 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:50.009795 kubelet[1782]: E0213 15:57:50.009721 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:50.074928 containerd[1468]: time="2025-02-13T15:57:50.074804634Z" level=info msg="Kill container \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\"" Feb 13 15:57:50.094427 systemd[1]: cri-containerd-5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc.scope: Deactivated successfully. Feb 13 15:57:50.094678 systemd[1]: cri-containerd-5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc.scope: Consumed 1.619s CPU time. Feb 13 15:57:50.100294 update_engine[1454]: I20250213 15:57:50.099236 1454 update_attempter.cc:509] Updating boot flags... Feb 13 15:57:50.139329 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc-rootfs.mount: Deactivated successfully. 
Feb 13 15:57:50.143539 containerd[1468]: time="2025-02-13T15:57:50.142203793Z" level=info msg="shim disconnected" id=5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc namespace=k8s.io Feb 13 15:57:50.143539 containerd[1468]: time="2025-02-13T15:57:50.142268736Z" level=warning msg="cleaning up after shim disconnected" id=5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc namespace=k8s.io Feb 13 15:57:50.143539 containerd[1468]: time="2025-02-13T15:57:50.142280131Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 15:57:50.176209 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (3511) Feb 13 15:57:50.181965 containerd[1468]: time="2025-02-13T15:57:50.181302427Z" level=warning msg="cleanup warnings time=\"2025-02-13T15:57:50Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Feb 13 15:57:50.394336 containerd[1468]: time="2025-02-13T15:57:50.394209238Z" level=info msg="StopContainer for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" returns successfully" Feb 13 15:57:50.396783 containerd[1468]: time="2025-02-13T15:57:50.396391729Z" level=info msg="StopPodSandbox for \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\"" Feb 13 15:57:50.396783 containerd[1468]: time="2025-02-13T15:57:50.396451395Z" level=info msg="Container to stop \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Feb 13 15:57:50.396783 containerd[1468]: time="2025-02-13T15:57:50.396502980Z" level=info msg="Container to stop \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Feb 13 15:57:50.396783 containerd[1468]: time="2025-02-13T15:57:50.396516053Z" level=info msg="Container 
to stop \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Feb 13 15:57:50.404036 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078-shm.mount: Deactivated successfully. Feb 13 15:57:50.424358 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (3517) Feb 13 15:57:50.424663 systemd[1]: cri-containerd-7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078.scope: Deactivated successfully. Feb 13 15:57:50.514781 containerd[1468]: time="2025-02-13T15:57:50.514611861Z" level=info msg="shim disconnected" id=7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078 namespace=k8s.io Feb 13 15:57:50.515046 containerd[1468]: time="2025-02-13T15:57:50.515020800Z" level=warning msg="cleaning up after shim disconnected" id=7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078 namespace=k8s.io Feb 13 15:57:50.515301 containerd[1468]: time="2025-02-13T15:57:50.515176637Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 15:57:50.545595 containerd[1468]: time="2025-02-13T15:57:50.545322333Z" level=info msg="TearDown network for sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" successfully" Feb 13 15:57:50.545595 containerd[1468]: time="2025-02-13T15:57:50.545534269Z" level=info msg="StopPodSandbox for \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" returns successfully" Feb 13 15:57:50.604209 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078-rootfs.mount: Deactivated successfully. 
Feb 13 15:57:50.621716 kubelet[1782]: I0213 15:57:50.621655 1782 topology_manager.go:215] "Topology Admit Handler" podUID="f1356cb5-cc17-43dc-9897-98c782399a18" podNamespace="calico-system" podName="calico-node-2vqgn" Feb 13 15:57:50.622821 kubelet[1782]: E0213 15:57:50.622062 1782 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="900cfedd-57bb-4899-8255-b9e43c2da5e5" containerName="install-cni" Feb 13 15:57:50.622821 kubelet[1782]: E0213 15:57:50.622092 1782 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="900cfedd-57bb-4899-8255-b9e43c2da5e5" containerName="calico-node" Feb 13 15:57:50.622821 kubelet[1782]: E0213 15:57:50.622103 1782 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="900cfedd-57bb-4899-8255-b9e43c2da5e5" containerName="flexvol-driver" Feb 13 15:57:50.622821 kubelet[1782]: I0213 15:57:50.622169 1782 memory_manager.go:354] "RemoveStaleState removing state" podUID="900cfedd-57bb-4899-8255-b9e43c2da5e5" containerName="calico-node" Feb 13 15:57:50.638516 systemd[1]: Created slice kubepods-besteffort-podf1356cb5_cc17_43dc_9897_98c782399a18.slice - libcontainer container kubepods-besteffort-podf1356cb5_cc17_43dc_9897_98c782399a18.slice. 
Feb 13 15:57:50.702193 kubelet[1782]: I0213 15:57:50.701061 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-run-calico\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702193 kubelet[1782]: I0213 15:57:50.701150 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-bin-dir\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702193 kubelet[1782]: I0213 15:57:50.701185 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-log-dir\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702193 kubelet[1782]: I0213 15:57:50.701212 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-lib-calico\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702193 kubelet[1782]: I0213 15:57:50.701247 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rd9n\" (UniqueName: \"kubernetes.io/projected/900cfedd-57bb-4899-8255-b9e43c2da5e5-kube-api-access-5rd9n\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702193 kubelet[1782]: I0213 15:57:50.701300 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: 
\"kubernetes.io/secret/900cfedd-57bb-4899-8255-b9e43c2da5e5-node-certs\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702639 kubelet[1782]: I0213 15:57:50.701327 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-lib-modules\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702639 kubelet[1782]: I0213 15:57:50.701351 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-net-dir\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702639 kubelet[1782]: I0213 15:57:50.701378 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-flexvol-driver-host\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702639 kubelet[1782]: I0213 15:57:50.701411 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/900cfedd-57bb-4899-8255-b9e43c2da5e5-tigera-ca-bundle\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702639 kubelet[1782]: I0213 15:57:50.701435 1782 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-policysync\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702639 kubelet[1782]: I0213 15:57:50.701457 1782 
reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-xtables-lock\") pod \"900cfedd-57bb-4899-8255-b9e43c2da5e5\" (UID: \"900cfedd-57bb-4899-8255-b9e43c2da5e5\") " Feb 13 15:57:50.702871 kubelet[1782]: I0213 15:57:50.701577 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.702871 kubelet[1782]: I0213 15:57:50.701643 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.702871 kubelet[1782]: I0213 15:57:50.701678 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.702871 kubelet[1782]: I0213 15:57:50.701704 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "cni-log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.702871 kubelet[1782]: I0213 15:57:50.701731 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.703853 kubelet[1782]: I0213 15:57:50.703171 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.710211 kubelet[1782]: I0213 15:57:50.708217 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.714937 systemd[1]: var-lib-kubelet-pods-900cfedd\x2d57bb\x2d4899\x2d8255\x2db9e43c2da5e5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5rd9n.mount: Deactivated successfully. Feb 13 15:57:50.719608 kubelet[1782]: I0213 15:57:50.719492 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900cfedd-57bb-4899-8255-b9e43c2da5e5-node-certs" (OuterVolumeSpecName: "node-certs") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "node-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 13 15:57:50.720964 kubelet[1782]: I0213 15:57:50.720775 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900cfedd-57bb-4899-8255-b9e43c2da5e5-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 13 15:57:50.720964 kubelet[1782]: I0213 15:57:50.720866 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-policysync" (OuterVolumeSpecName: "policysync") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.720964 kubelet[1782]: I0213 15:57:50.720960 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900cfedd-57bb-4899-8255-b9e43c2da5e5-kube-api-access-5rd9n" (OuterVolumeSpecName: "kube-api-access-5rd9n") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "kube-api-access-5rd9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 13 15:57:50.722282 kubelet[1782]: I0213 15:57:50.721283 1782 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "900cfedd-57bb-4899-8255-b9e43c2da5e5" (UID: "900cfedd-57bb-4899-8255-b9e43c2da5e5"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 13 15:57:50.722172 systemd[1]: var-lib-kubelet-pods-900cfedd\x2d57bb\x2d4899\x2d8255\x2db9e43c2da5e5-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. Feb 13 15:57:50.726082 systemd[1]: var-lib-kubelet-pods-900cfedd\x2d57bb\x2d4899\x2d8255\x2db9e43c2da5e5-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Feb 13 15:57:50.789716 kubelet[1782]: I0213 15:57:50.786379 1782 scope.go:117] "RemoveContainer" containerID="5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc" Feb 13 15:57:50.792557 containerd[1468]: time="2025-02-13T15:57:50.792178799Z" level=info msg="RemoveContainer for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\"" Feb 13 15:57:50.798495 systemd[1]: Removed slice kubepods-besteffort-pod900cfedd_57bb_4899_8255_b9e43c2da5e5.slice - libcontainer container kubepods-besteffort-pod900cfedd_57bb_4899_8255_b9e43c2da5e5.slice. Feb 13 15:57:50.798663 systemd[1]: kubepods-besteffort-pod900cfedd_57bb_4899_8255_b9e43c2da5e5.slice: Consumed 2.642s CPU time. 
Feb 13 15:57:50.801812 kubelet[1782]: I0213 15:57:50.801764 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-policysync\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802023 kubelet[1782]: I0213 15:57:50.801808 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-cni-bin-dir\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802023 kubelet[1782]: I0213 15:57:50.801894 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-lib-modules\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802023 kubelet[1782]: I0213 15:57:50.801923 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-cni-net-dir\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802023 kubelet[1782]: I0213 15:57:50.801970 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-cni-log-dir\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802023 kubelet[1782]: I0213 15:57:50.801992 1782 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-flexvol-driver-host\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802280 kubelet[1782]: I0213 15:57:50.802011 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1356cb5-cc17-43dc-9897-98c782399a18-tigera-ca-bundle\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802280 kubelet[1782]: I0213 15:57:50.802051 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-var-lib-calico\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802280 kubelet[1782]: I0213 15:57:50.802076 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-var-run-calico\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802751 kubelet[1782]: I0213 15:57:50.802571 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bptrw\" (UniqueName: \"kubernetes.io/projected/f1356cb5-cc17-43dc-9897-98c782399a18-kube-api-access-bptrw\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802751 kubelet[1782]: I0213 15:57:50.802660 1782 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f1356cb5-cc17-43dc-9897-98c782399a18-xtables-lock\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802751 kubelet[1782]: I0213 15:57:50.802723 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f1356cb5-cc17-43dc-9897-98c782399a18-node-certs\") pod \"calico-node-2vqgn\" (UID: \"f1356cb5-cc17-43dc-9897-98c782399a18\") " pod="calico-system/calico-node-2vqgn" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802797 1782 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-flexvol-driver-host\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802817 1782 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-policysync\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802831 1782 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-run-calico\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802876 1782 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-log-dir\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802890 1782 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/900cfedd-57bb-4899-8255-b9e43c2da5e5-node-certs\") on node 
\"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802902 1782 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-lib-modules\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802915 1782 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-net-dir\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.802973 kubelet[1782]: I0213 15:57:50.802962 1782 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-var-lib-calico\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.803294 kubelet[1782]: I0213 15:57:50.802984 1782 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-5rd9n\" (UniqueName: \"kubernetes.io/projected/900cfedd-57bb-4899-8255-b9e43c2da5e5-kube-api-access-5rd9n\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.803294 kubelet[1782]: I0213 15:57:50.802993 1782 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/900cfedd-57bb-4899-8255-b9e43c2da5e5-tigera-ca-bundle\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.803294 kubelet[1782]: I0213 15:57:50.803005 1782 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-xtables-lock\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.803294 kubelet[1782]: I0213 15:57:50.803046 1782 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/900cfedd-57bb-4899-8255-b9e43c2da5e5-cni-bin-dir\") on node \"143.198.68.221\" DevicePath \"\"" Feb 13 15:57:50.810574 
containerd[1468]: time="2025-02-13T15:57:50.810502130Z" level=info msg="RemoveContainer for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" returns successfully" Feb 13 15:57:50.811894 kubelet[1782]: I0213 15:57:50.811620 1782 scope.go:117] "RemoveContainer" containerID="f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e" Feb 13 15:57:50.815036 containerd[1468]: time="2025-02-13T15:57:50.814833642Z" level=info msg="RemoveContainer for \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\"" Feb 13 15:57:50.824091 containerd[1468]: time="2025-02-13T15:57:50.824010112Z" level=info msg="RemoveContainer for \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\" returns successfully" Feb 13 15:57:50.824446 kubelet[1782]: I0213 15:57:50.824394 1782 scope.go:117] "RemoveContainer" containerID="7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf" Feb 13 15:57:50.828616 containerd[1468]: time="2025-02-13T15:57:50.826443470Z" level=info msg="RemoveContainer for \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\"" Feb 13 15:57:50.831713 containerd[1468]: time="2025-02-13T15:57:50.831648098Z" level=info msg="RemoveContainer for \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\" returns successfully" Feb 13 15:57:50.832402 kubelet[1782]: I0213 15:57:50.832361 1782 scope.go:117] "RemoveContainer" containerID="5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc" Feb 13 15:57:50.832912 containerd[1468]: time="2025-02-13T15:57:50.832844267Z" level=error msg="ContainerStatus for \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\": not found" Feb 13 15:57:50.833644 kubelet[1782]: E0213 15:57:50.833205 1782 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = an error occurred when try to find container \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\": not found" containerID="5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc" Feb 13 15:57:50.833644 kubelet[1782]: I0213 15:57:50.833253 1782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc"} err="failed to get container status \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\": rpc error: code = NotFound desc = an error occurred when try to find container \"5750f30b9c04b4206bf1eb9e9f84520274510ba4ab06ccc74e801c72fb52d9bc\": not found" Feb 13 15:57:50.833644 kubelet[1782]: I0213 15:57:50.833325 1782 scope.go:117] "RemoveContainer" containerID="f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e" Feb 13 15:57:50.834004 containerd[1468]: time="2025-02-13T15:57:50.833844349Z" level=error msg="ContainerStatus for \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\": not found" Feb 13 15:57:50.834465 kubelet[1782]: E0213 15:57:50.834294 1782 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\": not found" containerID="f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e" Feb 13 15:57:50.834537 kubelet[1782]: I0213 15:57:50.834474 1782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e"} err="failed to get container status \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\": rpc error: 
code = NotFound desc = an error occurred when try to find container \"f6eeb7ef76e2e6ccd38108b6bbc3258be8f9d6c170de299aa17179aa28ff834e\": not found" Feb 13 15:57:50.834537 kubelet[1782]: I0213 15:57:50.834498 1782 scope.go:117] "RemoveContainer" containerID="7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf" Feb 13 15:57:50.834774 containerd[1468]: time="2025-02-13T15:57:50.834729743Z" level=error msg="ContainerStatus for \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\": not found" Feb 13 15:57:50.834991 kubelet[1782]: E0213 15:57:50.834928 1782 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\": not found" containerID="7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf" Feb 13 15:57:50.834991 kubelet[1782]: I0213 15:57:50.834953 1782 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf"} err="failed to get container status \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\": rpc error: code = NotFound desc = an error occurred when try to find container \"7ecfed694940015de018f72e0276945298909249296392fd3689534c67dcb1cf\": not found" Feb 13 15:57:50.949168 kubelet[1782]: E0213 15:57:50.947572 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:50.951955 containerd[1468]: time="2025-02-13T15:57:50.951892240Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-node-2vqgn,Uid:f1356cb5-cc17-43dc-9897-98c782399a18,Namespace:calico-system,Attempt:0,}" Feb 13 15:57:51.012663 kubelet[1782]: E0213 15:57:51.010747 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:51.019913 containerd[1468]: time="2025-02-13T15:57:51.018766137Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:51.019913 containerd[1468]: time="2025-02-13T15:57:51.019859711Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:51.020332 containerd[1468]: time="2025-02-13T15:57:51.019889745Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:51.027717 containerd[1468]: time="2025-02-13T15:57:51.024095185Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:51.061561 systemd[1]: Started cri-containerd-6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99.scope - libcontainer container 6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99. 
Feb 13 15:57:51.095801 kubelet[1782]: I0213 15:57:51.095726 1782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="900cfedd-57bb-4899-8255-b9e43c2da5e5" path="/var/lib/kubelet/pods/900cfedd-57bb-4899-8255-b9e43c2da5e5/volumes" Feb 13 15:57:51.126860 containerd[1468]: time="2025-02-13T15:57:51.126785340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2vqgn,Uid:f1356cb5-cc17-43dc-9897-98c782399a18,Namespace:calico-system,Attempt:0,} returns sandbox id \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\"" Feb 13 15:57:51.128319 kubelet[1782]: E0213 15:57:51.128287 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:51.131429 containerd[1468]: time="2025-02-13T15:57:51.131375590Z" level=info msg="CreateContainer within sandbox \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 15:57:51.153384 containerd[1468]: time="2025-02-13T15:57:51.153010062Z" level=info msg="CreateContainer within sandbox \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d\"" Feb 13 15:57:51.155203 containerd[1468]: time="2025-02-13T15:57:51.154457566Z" level=info msg="StartContainer for \"6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d\"" Feb 13 15:57:51.219542 systemd[1]: Started cri-containerd-6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d.scope - libcontainer container 6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d. 
Feb 13 15:57:51.384266 containerd[1468]: time="2025-02-13T15:57:51.384026144Z" level=info msg="StartContainer for \"6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d\" returns successfully" Feb 13 15:57:51.490898 systemd[1]: cri-containerd-6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d.scope: Deactivated successfully. Feb 13 15:57:51.661767 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d-rootfs.mount: Deactivated successfully. Feb 13 15:57:51.758027 containerd[1468]: time="2025-02-13T15:57:51.757928094Z" level=info msg="shim disconnected" id=6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d namespace=k8s.io Feb 13 15:57:51.758027 containerd[1468]: time="2025-02-13T15:57:51.758008314Z" level=warning msg="cleaning up after shim disconnected" id=6aeae560307cdfdf0e493c40300097b6222b65b410c28c2547bacef48b50552d namespace=k8s.io Feb 13 15:57:51.758432 containerd[1468]: time="2025-02-13T15:57:51.758022812Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 15:57:51.807573 kubelet[1782]: E0213 15:57:51.806970 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:52.012333 kubelet[1782]: E0213 15:57:52.010907 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:52.756371 kubelet[1782]: I0213 15:57:52.755113 1782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 15:57:52.758047 kubelet[1782]: E0213 15:57:52.757862 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:52.845687 kubelet[1782]: E0213 15:57:52.845635 1782 dns.go:153] 
"Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:52.846879 kubelet[1782]: E0213 15:57:52.845677 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:52.857188 containerd[1468]: time="2025-02-13T15:57:52.857069855Z" level=info msg="CreateContainer within sandbox \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 15:57:52.896444 containerd[1468]: time="2025-02-13T15:57:52.894325896Z" level=info msg="CreateContainer within sandbox \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c\"" Feb 13 15:57:52.896444 containerd[1468]: time="2025-02-13T15:57:52.895560867Z" level=info msg="StartContainer for \"5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c\"" Feb 13 15:57:53.012155 systemd[1]: Started cri-containerd-5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c.scope - libcontainer container 5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c. 
Feb 13 15:57:53.012945 kubelet[1782]: E0213 15:57:53.012735 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:53.127638 containerd[1468]: time="2025-02-13T15:57:53.123844743Z" level=info msg="StartContainer for \"5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c\" returns successfully" Feb 13 15:57:53.867523 kubelet[1782]: E0213 15:57:53.866884 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:54.013451 kubelet[1782]: E0213 15:57:54.013302 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:54.859777 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3260169098.mount: Deactivated successfully. Feb 13 15:57:54.890772 kubelet[1782]: E0213 15:57:54.889699 1782 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:55.018321 kubelet[1782]: E0213 15:57:55.018161 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:55.179044 systemd[1]: cri-containerd-5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c.scope: Deactivated successfully. Feb 13 15:57:55.180330 systemd[1]: cri-containerd-5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c.scope: Consumed 1.055s CPU time. Feb 13 15:57:55.274205 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c-rootfs.mount: Deactivated successfully. 
Feb 13 15:57:55.483680 containerd[1468]: time="2025-02-13T15:57:55.483296174Z" level=info msg="shim disconnected" id=5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c namespace=k8s.io Feb 13 15:57:55.483680 containerd[1468]: time="2025-02-13T15:57:55.483378559Z" level=warning msg="cleaning up after shim disconnected" id=5e062e130b1b41f2597f38cd4b6ebcab8efa976a6977718470bdd559c68b889c namespace=k8s.io Feb 13 15:57:55.483680 containerd[1468]: time="2025-02-13T15:57:55.483391199Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 15:57:55.522829 containerd[1468]: time="2025-02-13T15:57:55.514251143Z" level=warning msg="cleanup warnings time=\"2025-02-13T15:57:55Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Feb 13 15:57:55.890421 kubelet[1782]: E0213 15:57:55.890262 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:55.931978 containerd[1468]: time="2025-02-13T15:57:55.931885201Z" level=info msg="CreateContainer within sandbox \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 15:57:55.973327 containerd[1468]: time="2025-02-13T15:57:55.973266712Z" level=info msg="CreateContainer within sandbox \"6c1cdfb23bb8ad9931432a97d8bc42b4d41360b2fb80fb859785ee4daf083a99\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fe4bfcc13bba6ff91931e6d4041779d053f4079dfe43ee1778e58a7f5a7381d4\"" Feb 13 15:57:55.978155 containerd[1468]: time="2025-02-13T15:57:55.976823771Z" level=info msg="StartContainer for \"fe4bfcc13bba6ff91931e6d4041779d053f4079dfe43ee1778e58a7f5a7381d4\"" Feb 13 15:57:56.019214 kubelet[1782]: E0213 15:57:56.019087 1782 file_linux.go:61] "Unable to read 
config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:56.082632 systemd[1]: Started cri-containerd-fe4bfcc13bba6ff91931e6d4041779d053f4079dfe43ee1778e58a7f5a7381d4.scope - libcontainer container fe4bfcc13bba6ff91931e6d4041779d053f4079dfe43ee1778e58a7f5a7381d4. Feb 13 15:57:56.166172 containerd[1468]: time="2025-02-13T15:57:56.166075324Z" level=info msg="StartContainer for \"fe4bfcc13bba6ff91931e6d4041779d053f4079dfe43ee1778e58a7f5a7381d4\" returns successfully" Feb 13 15:57:56.899582 kubelet[1782]: E0213 15:57:56.899530 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:56.917623 systemd[1]: run-containerd-runc-k8s.io-fe4bfcc13bba6ff91931e6d4041779d053f4079dfe43ee1778e58a7f5a7381d4-runc.ebt1u7.mount: Deactivated successfully. Feb 13 15:57:56.967112 kubelet[1782]: I0213 15:57:56.966210 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-2vqgn" podStartSLOduration=6.966086941 podStartE2EDuration="6.966086941s" podCreationTimestamp="2025-02-13 15:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:57:56.957889755 +0000 UTC m=+42.693220555" watchObservedRunningTime="2025-02-13 15:57:56.966086941 +0000 UTC m=+42.701417738" Feb 13 15:57:57.019489 kubelet[1782]: E0213 15:57:57.019298 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:57.905814 kubelet[1782]: E0213 15:57:57.905717 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" Feb 13 15:57:58.020083 kubelet[1782]: E0213 15:57:58.019992 1782 file_linux.go:61] 
"Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:58.288943 containerd[1468]: time="2025-02-13T15:57:58.288621594Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:58.289827 containerd[1468]: time="2025-02-13T15:57:58.289324978Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=73054493" Feb 13 15:57:58.294870 containerd[1468]: time="2025-02-13T15:57:58.294804048Z" level=info msg="ImageCreate event name:\"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:58.305484 containerd[1468]: time="2025-02-13T15:57:58.305298964Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:58.310555 containerd[1468]: time="2025-02-13T15:57:58.310481011Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 8.739462609s" Feb 13 15:57:58.310555 containerd[1468]: time="2025-02-13T15:57:58.310540167Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\"" Feb 13 15:57:58.314998 containerd[1468]: time="2025-02-13T15:57:58.314924342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 15:57:58.331759 containerd[1468]: time="2025-02-13T15:57:58.331663772Z" level=info msg="CreateContainer within sandbox 
\"c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Feb 13 15:57:58.373310 containerd[1468]: time="2025-02-13T15:57:58.371306722Z" level=info msg="CreateContainer within sandbox \"c5b0bc26cc0f14a3998b64937e450dc1b52ccb560cbb0f98d3929bcf87d74ad2\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id \"6b58ef7abd93c1acd615fe72077883c72211cb6539bdb76873adcee52826bdb7\"" Feb 13 15:57:58.376748 containerd[1468]: time="2025-02-13T15:57:58.374042427Z" level=info msg="StartContainer for \"6b58ef7abd93c1acd615fe72077883c72211cb6539bdb76873adcee52826bdb7\"" Feb 13 15:57:58.494475 systemd[1]: Started cri-containerd-6b58ef7abd93c1acd615fe72077883c72211cb6539bdb76873adcee52826bdb7.scope - libcontainer container 6b58ef7abd93c1acd615fe72077883c72211cb6539bdb76873adcee52826bdb7. Feb 13 15:57:58.591582 containerd[1468]: time="2025-02-13T15:57:58.590637487Z" level=info msg="StartContainer for \"6b58ef7abd93c1acd615fe72077883c72211cb6539bdb76873adcee52826bdb7\" returns successfully" Feb 13 15:57:58.673924 kernel: bpftool[4041]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 15:57:58.969230 kubelet[1782]: I0213 15:57:58.969107 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx-deployment-85f456d6dd-b2htq" podStartSLOduration=11.787423938 podStartE2EDuration="23.969064849s" podCreationTimestamp="2025-02-13 15:57:35 +0000 UTC" firstStartedPulling="2025-02-13 15:57:46.131412759 +0000 UTC m=+31.866743545" lastFinishedPulling="2025-02-13 15:57:58.31305368 +0000 UTC m=+44.048384456" observedRunningTime="2025-02-13 15:57:58.967082302 +0000 UTC m=+44.702413114" watchObservedRunningTime="2025-02-13 15:57:58.969064849 +0000 UTC m=+44.704395659" Feb 13 15:57:59.020858 kubelet[1782]: E0213 15:57:59.020766 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:57:59.225737 
systemd-networkd[1360]: vxlan.calico: Link UP Feb 13 15:57:59.225753 systemd-networkd[1360]: vxlan.calico: Gained carrier Feb 13 15:58:00.021086 kubelet[1782]: E0213 15:58:00.021017 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:00.353683 systemd-networkd[1360]: vxlan.calico: Gained IPv6LL Feb 13 15:58:00.398032 containerd[1468]: time="2025-02-13T15:58:00.397951769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:00.400551 containerd[1468]: time="2025-02-13T15:58:00.400369706Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 15:58:00.403874 containerd[1468]: time="2025-02-13T15:58:00.403804188Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:00.419343 containerd[1468]: time="2025-02-13T15:58:00.410648890Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:00.419343 containerd[1468]: time="2025-02-13T15:58:00.412313797Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.097326106s" Feb 13 15:58:00.419343 containerd[1468]: time="2025-02-13T15:58:00.412375623Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 15:58:00.419934 containerd[1468]: time="2025-02-13T15:58:00.419840965Z" level=info msg="CreateContainer within sandbox \"eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 15:58:00.458809 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4125587855.mount: Deactivated successfully. Feb 13 15:58:00.485339 containerd[1468]: time="2025-02-13T15:58:00.485264925Z" level=info msg="CreateContainer within sandbox \"eeee21210b0243242b618fc9e42cc0a4bd54d8e7fc4c05f2efc3f29bef33738f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6272e07079d85c17ec9f226bf1ae54965d70e785b079fbc8e3c223d0a2838221\"" Feb 13 15:58:00.486416 containerd[1468]: time="2025-02-13T15:58:00.486363991Z" level=info msg="StartContainer for \"6272e07079d85c17ec9f226bf1ae54965d70e785b079fbc8e3c223d0a2838221\"" Feb 13 15:58:00.566655 systemd[1]: Started cri-containerd-6272e07079d85c17ec9f226bf1ae54965d70e785b079fbc8e3c223d0a2838221.scope - libcontainer container 6272e07079d85c17ec9f226bf1ae54965d70e785b079fbc8e3c223d0a2838221. 
Feb 13 15:58:00.655246 containerd[1468]: time="2025-02-13T15:58:00.654775445Z" level=info msg="StartContainer for \"6272e07079d85c17ec9f226bf1ae54965d70e785b079fbc8e3c223d0a2838221\" returns successfully" Feb 13 15:58:01.012554 kubelet[1782]: I0213 15:58:01.011915 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-h6pm6" podStartSLOduration=31.316203128 podStartE2EDuration="46.011890094s" podCreationTimestamp="2025-02-13 15:57:15 +0000 UTC" firstStartedPulling="2025-02-13 15:57:45.718637947 +0000 UTC m=+31.453968717" lastFinishedPulling="2025-02-13 15:58:00.414324894 +0000 UTC m=+46.149655683" observedRunningTime="2025-02-13 15:58:01.010065747 +0000 UTC m=+46.745396545" watchObservedRunningTime="2025-02-13 15:58:01.011890094 +0000 UTC m=+46.747220866" Feb 13 15:58:01.022194 kubelet[1782]: E0213 15:58:01.022067 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:01.158357 kubelet[1782]: I0213 15:58:01.157623 1782 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 15:58:01.158357 kubelet[1782]: I0213 15:58:01.157706 1782 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 15:58:02.022568 kubelet[1782]: E0213 15:58:02.022392 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:03.023759 kubelet[1782]: E0213 15:58:03.023657 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:04.024163 kubelet[1782]: E0213 15:58:04.024044 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 
15:58:05.025303 kubelet[1782]: E0213 15:58:05.025209 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:06.026954 kubelet[1782]: E0213 15:58:06.026514 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:07.027691 kubelet[1782]: E0213 15:58:07.027575 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:08.028021 kubelet[1782]: E0213 15:58:08.027941 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:09.028454 kubelet[1782]: E0213 15:58:09.028381 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:09.708153 kubelet[1782]: I0213 15:58:09.708043 1782 topology_manager.go:215] "Topology Admit Handler" podUID="7d0f2e26-e4cd-45e2-b358-715760250d8c" podNamespace="default" podName="nfs-server-provisioner-0" Feb 13 15:58:09.731891 systemd[1]: Created slice kubepods-besteffort-pod7d0f2e26_e4cd_45e2_b358_715760250d8c.slice - libcontainer container kubepods-besteffort-pod7d0f2e26_e4cd_45e2_b358_715760250d8c.slice. 
Feb 13 15:58:09.812534 kubelet[1782]: I0213 15:58:09.812283 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7d0f2e26-e4cd-45e2-b358-715760250d8c-data\") pod \"nfs-server-provisioner-0\" (UID: \"7d0f2e26-e4cd-45e2-b358-715760250d8c\") " pod="default/nfs-server-provisioner-0" Feb 13 15:58:09.812534 kubelet[1782]: I0213 15:58:09.812363 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvt5z\" (UniqueName: \"kubernetes.io/projected/7d0f2e26-e4cd-45e2-b358-715760250d8c-kube-api-access-qvt5z\") pod \"nfs-server-provisioner-0\" (UID: \"7d0f2e26-e4cd-45e2-b358-715760250d8c\") " pod="default/nfs-server-provisioner-0" Feb 13 15:58:10.029527 kubelet[1782]: E0213 15:58:10.029314 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:10.047976 containerd[1468]: time="2025-02-13T15:58:10.047855699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:7d0f2e26-e4cd-45e2-b358-715760250d8c,Namespace:default,Attempt:0,}" Feb 13 15:58:10.503206 systemd-networkd[1360]: cali60e51b789ff: Link UP Feb 13 15:58:10.516863 systemd-networkd[1360]: cali60e51b789ff: Gained carrier Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.309 [INFO][4183] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {143.198.68.221-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default 7d0f2e26-e4cd-45e2-b358-715760250d8c 1535 0 2025-02-13 15:58:09 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner 
release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 143.198.68.221 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.310 [INFO][4183] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.372 [INFO][4193] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" HandleID="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Workload="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.393 [INFO][4193] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" HandleID="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Workload="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000310b50), Attrs:map[string]string{"namespace":"default", "node":"143.198.68.221", "pod":"nfs-server-provisioner-0", 
"timestamp":"2025-02-13 15:58:10.372788286 +0000 UTC"}, Hostname:"143.198.68.221", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.393 [INFO][4193] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.393 [INFO][4193] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.394 [INFO][4193] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '143.198.68.221' Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.399 [INFO][4193] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.412 [INFO][4193] ipam/ipam.go 372: Looking up existing affinities for host host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.423 [INFO][4193] ipam/ipam.go 489: Trying affinity for 192.168.3.64/26 host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.430 [INFO][4193] ipam/ipam.go 155: Attempting to load block cidr=192.168.3.64/26 host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.437 [INFO][4193] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.437 [INFO][4193] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.443 [INFO][4193] ipam/ipam.go 1685: 
Creating new handle: k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59 Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.459 [INFO][4193] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.488 [INFO][4193] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.3.67/26] block=192.168.3.64/26 handle="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.488 [INFO][4193] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.3.67/26] handle="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" host="143.198.68.221" Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.488 [INFO][4193] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 15:58:10.565430 containerd[1468]: 2025-02-13 15:58:10.488 [INFO][4193] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.67/26] IPv6=[] ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" HandleID="k8s-pod-network.a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Workload="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.567313 containerd[1468]: 2025-02-13 15:58:10.493 [INFO][4183] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"7d0f2e26-e4cd-45e2-b358-715760250d8c", ResourceVersion:"1535", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 58, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", 
ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.3.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, 
HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:10.567313 containerd[1468]: 2025-02-13 15:58:10.494 [INFO][4183] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.3.67/32] ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.567313 containerd[1468]: 2025-02-13 15:58:10.494 [INFO][4183] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.567313 containerd[1468]: 2025-02-13 15:58:10.505 [INFO][4183] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.568893 containerd[1468]: 2025-02-13 15:58:10.507 [INFO][4183] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"7d0f2e26-e4cd-45e2-b358-715760250d8c", ResourceVersion:"1535", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 58, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.3.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"6a:a2:e0:0a:8b:46", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:10.568893 containerd[1468]: 2025-02-13 15:58:10.557 [INFO][4183] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="143.198.68.221-k8s-nfs--server--provisioner--0-eth0" Feb 13 15:58:10.644279 containerd[1468]: time="2025-02-13T15:58:10.642523553Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:58:10.644279 containerd[1468]: time="2025-02-13T15:58:10.642646387Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:58:10.644279 containerd[1468]: time="2025-02-13T15:58:10.642671638Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:10.644279 containerd[1468]: time="2025-02-13T15:58:10.643032441Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:10.703762 systemd[1]: Started cri-containerd-a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59.scope - libcontainer container a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59. Feb 13 15:58:10.796966 containerd[1468]: time="2025-02-13T15:58:10.795094411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:7d0f2e26-e4cd-45e2-b358-715760250d8c,Namespace:default,Attempt:0,} returns sandbox id \"a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59\"" Feb 13 15:58:10.802603 containerd[1468]: time="2025-02-13T15:58:10.802541778Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Feb 13 15:58:10.939379 systemd[1]: run-containerd-runc-k8s.io-a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59-runc.mDIHq1.mount: Deactivated successfully. 
Feb 13 15:58:11.030342 kubelet[1782]: E0213 15:58:11.030241 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:11.683286 systemd-networkd[1360]: cali60e51b789ff: Gained IPv6LL Feb 13 15:58:12.031683 kubelet[1782]: E0213 15:58:12.031387 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:13.031920 kubelet[1782]: E0213 15:58:13.031803 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:14.032978 kubelet[1782]: E0213 15:58:14.032918 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:14.110812 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2002524714.mount: Deactivated successfully. Feb 13 15:58:14.889604 kubelet[1782]: E0213 15:58:14.889536 1782 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:14.973791 containerd[1468]: time="2025-02-13T15:58:14.973683917Z" level=info msg="StopPodSandbox for \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\"" Feb 13 15:58:14.976031 containerd[1468]: time="2025-02-13T15:58:14.975889398Z" level=info msg="TearDown network for sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" successfully" Feb 13 15:58:14.976386 containerd[1468]: time="2025-02-13T15:58:14.976158921Z" level=info msg="StopPodSandbox for \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" returns successfully" Feb 13 15:58:14.984820 containerd[1468]: time="2025-02-13T15:58:14.984723590Z" level=info msg="RemovePodSandbox for \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\"" Feb 13 15:58:15.005721 containerd[1468]: time="2025-02-13T15:58:15.005647485Z" level=info msg="Forcibly stopping 
sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\"" Feb 13 15:58:15.006191 containerd[1468]: time="2025-02-13T15:58:15.006058266Z" level=info msg="TearDown network for sandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" successfully" Feb 13 15:58:15.028322 containerd[1468]: time="2025-02-13T15:58:15.028078062Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 15:58:15.028703 containerd[1468]: time="2025-02-13T15:58:15.028653553Z" level=info msg="RemovePodSandbox \"7f16264125abbc461783271448fec548aab88563ec99ef643a82a27465a29078\" returns successfully" Feb 13 15:58:15.032821 containerd[1468]: time="2025-02-13T15:58:15.032753825Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:58:15.033029 containerd[1468]: time="2025-02-13T15:58:15.032938937Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:58:15.033029 containerd[1468]: time="2025-02-13T15:58:15.033012477Z" level=info msg="StopPodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:58:15.033900 kubelet[1782]: E0213 15:58:15.033702 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 15:58:15.034457 containerd[1468]: time="2025-02-13T15:58:15.033981427Z" level=info msg="RemovePodSandbox for \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:58:15.034457 containerd[1468]: time="2025-02-13T15:58:15.034024210Z" level=info msg="Forcibly stopping sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\"" Feb 13 15:58:15.034457 
containerd[1468]: time="2025-02-13T15:58:15.034166096Z" level=info msg="TearDown network for sandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" successfully" Feb 13 15:58:15.042449 containerd[1468]: time="2025-02-13T15:58:15.042338996Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 15:58:15.042666 containerd[1468]: time="2025-02-13T15:58:15.042478964Z" level=info msg="RemovePodSandbox \"91a2ef10ad520d904211a4a8c8bde25eb5746b784977ed25b1d839596f45d1c7\" returns successfully" Feb 13 15:58:15.043988 containerd[1468]: time="2025-02-13T15:58:15.043701426Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:58:15.043988 containerd[1468]: time="2025-02-13T15:58:15.043863435Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:58:15.043988 containerd[1468]: time="2025-02-13T15:58:15.043882905Z" level=info msg="StopPodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:58:15.046166 containerd[1468]: time="2025-02-13T15:58:15.045621814Z" level=info msg="RemovePodSandbox for \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:58:15.046369 containerd[1468]: time="2025-02-13T15:58:15.046215450Z" level=info msg="Forcibly stopping sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\"" Feb 13 15:58:15.046463 containerd[1468]: time="2025-02-13T15:58:15.046394986Z" level=info msg="TearDown network for sandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" successfully" Feb 13 15:58:15.066919 containerd[1468]: time="2025-02-13T15:58:15.066636974Z" level=warning 
msg="Failed to get podSandbox status for container event for sandboxID \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 15:58:15.066919 containerd[1468]: time="2025-02-13T15:58:15.066768486Z" level=info msg="RemovePodSandbox \"93700d4ec728542336890fc255c9d8267e54aae0338e15ab343f80e40d8d7ccf\" returns successfully" Feb 13 15:58:15.068153 containerd[1468]: time="2025-02-13T15:58:15.067981464Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:58:15.068793 containerd[1468]: time="2025-02-13T15:58:15.068401739Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:58:15.068793 containerd[1468]: time="2025-02-13T15:58:15.068430927Z" level=info msg="StopPodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully" Feb 13 15:58:15.069495 containerd[1468]: time="2025-02-13T15:58:15.069454046Z" level=info msg="RemovePodSandbox for \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:58:15.069642 containerd[1468]: time="2025-02-13T15:58:15.069582524Z" level=info msg="Forcibly stopping sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\"" Feb 13 15:58:15.069881 containerd[1468]: time="2025-02-13T15:58:15.069804712Z" level=info msg="TearDown network for sandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" successfully" Feb 13 15:58:15.083549 containerd[1468]: time="2025-02-13T15:58:15.083416426Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:15.084136 containerd[1468]: time="2025-02-13T15:58:15.083774905Z" level=info msg="RemovePodSandbox \"195bfb7b03b925e40a769b66ebfa441e50272ea75187a8831400bc6e268172eb\" returns successfully"
Feb 13 15:58:15.086006 containerd[1468]: time="2025-02-13T15:58:15.085703674Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\""
Feb 13 15:58:15.086006 containerd[1468]: time="2025-02-13T15:58:15.085879068Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully"
Feb 13 15:58:15.086006 containerd[1468]: time="2025-02-13T15:58:15.085900439Z" level=info msg="StopPodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully"
Feb 13 15:58:15.088331 containerd[1468]: time="2025-02-13T15:58:15.088223264Z" level=info msg="RemovePodSandbox for \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\""
Feb 13 15:58:15.088577 containerd[1468]: time="2025-02-13T15:58:15.088521440Z" level=info msg="Forcibly stopping sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\""
Feb 13 15:58:15.089199 containerd[1468]: time="2025-02-13T15:58:15.088999044Z" level=info msg="TearDown network for sandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" successfully"
Feb 13 15:58:15.105162 containerd[1468]: time="2025-02-13T15:58:15.105018278Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.105162 containerd[1468]: time="2025-02-13T15:58:15.105141342Z" level=info msg="RemovePodSandbox \"2b4e4803dce646a7bed3af77228655cc9b20d454b279d6f66ae8bdc12a7bf6f2\" returns successfully"
Feb 13 15:58:15.123193 containerd[1468]: time="2025-02-13T15:58:15.108875645Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\""
Feb 13 15:58:15.123193 containerd[1468]: time="2025-02-13T15:58:15.109017469Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully"
Feb 13 15:58:15.123193 containerd[1468]: time="2025-02-13T15:58:15.109029381Z" level=info msg="StopPodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully"
Feb 13 15:58:15.123193 containerd[1468]: time="2025-02-13T15:58:15.113758999Z" level=info msg="RemovePodSandbox for \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\""
Feb 13 15:58:15.123193 containerd[1468]: time="2025-02-13T15:58:15.113839165Z" level=info msg="Forcibly stopping sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\""
Feb 13 15:58:15.131928 containerd[1468]: time="2025-02-13T15:58:15.131418518Z" level=info msg="TearDown network for sandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" successfully"
Feb 13 15:58:15.236812 containerd[1468]: time="2025-02-13T15:58:15.235728583Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.236812 containerd[1468]: time="2025-02-13T15:58:15.236008568Z" level=info msg="RemovePodSandbox \"89c74c9f965cbe2cc536ad5c7494b3ece03b9d075a82b1dd6fc53ca7a6ddd0d0\" returns successfully"
Feb 13 15:58:15.243002 containerd[1468]: time="2025-02-13T15:58:15.242591575Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\""
Feb 13 15:58:15.243002 containerd[1468]: time="2025-02-13T15:58:15.242788568Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully"
Feb 13 15:58:15.243002 containerd[1468]: time="2025-02-13T15:58:15.242809261Z" level=info msg="StopPodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully"
Feb 13 15:58:15.252167 containerd[1468]: time="2025-02-13T15:58:15.249512376Z" level=info msg="RemovePodSandbox for \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\""
Feb 13 15:58:15.252167 containerd[1468]: time="2025-02-13T15:58:15.249585859Z" level=info msg="Forcibly stopping sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\""
Feb 13 15:58:15.252167 containerd[1468]: time="2025-02-13T15:58:15.249728997Z" level=info msg="TearDown network for sandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" successfully"
Feb 13 15:58:15.395680 containerd[1468]: time="2025-02-13T15:58:15.395437031Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.395680 containerd[1468]: time="2025-02-13T15:58:15.395535890Z" level=info msg="RemovePodSandbox \"e9adc45b17c2991ba866fbc582f10760b5808cedceaee13846ec67fa6d83adb2\" returns successfully"
Feb 13 15:58:15.396923 containerd[1468]: time="2025-02-13T15:58:15.396534286Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\""
Feb 13 15:58:15.396923 containerd[1468]: time="2025-02-13T15:58:15.396725811Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully"
Feb 13 15:58:15.396923 containerd[1468]: time="2025-02-13T15:58:15.396744901Z" level=info msg="StopPodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully"
Feb 13 15:58:15.438902 containerd[1468]: time="2025-02-13T15:58:15.434834103Z" level=info msg="RemovePodSandbox for \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\""
Feb 13 15:58:15.438902 containerd[1468]: time="2025-02-13T15:58:15.434906282Z" level=info msg="Forcibly stopping sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\""
Feb 13 15:58:15.438902 containerd[1468]: time="2025-02-13T15:58:15.435021336Z" level=info msg="TearDown network for sandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" successfully"
Feb 13 15:58:15.442152 containerd[1468]: time="2025-02-13T15:58:15.442033479Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.442444 containerd[1468]: time="2025-02-13T15:58:15.442181789Z" level=info msg="RemovePodSandbox \"a25bfb7b4a1206d63068ddb41a864f98c25869c3cccadc85a2e818bb78fccd65\" returns successfully"
Feb 13 15:58:15.445922 containerd[1468]: time="2025-02-13T15:58:15.445843054Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\""
Feb 13 15:58:15.446103 containerd[1468]: time="2025-02-13T15:58:15.446016586Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully"
Feb 13 15:58:15.446103 containerd[1468]: time="2025-02-13T15:58:15.446038045Z" level=info msg="StopPodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully"
Feb 13 15:58:15.449597 containerd[1468]: time="2025-02-13T15:58:15.449529524Z" level=info msg="RemovePodSandbox for \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\""
Feb 13 15:58:15.449597 containerd[1468]: time="2025-02-13T15:58:15.449602019Z" level=info msg="Forcibly stopping sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\""
Feb 13 15:58:15.449909 containerd[1468]: time="2025-02-13T15:58:15.449748239Z" level=info msg="TearDown network for sandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" successfully"
Feb 13 15:58:15.460528 containerd[1468]: time="2025-02-13T15:58:15.460369679Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.460528 containerd[1468]: time="2025-02-13T15:58:15.460467449Z" level=info msg="RemovePodSandbox \"f847bf0dbcbe636c20d74905ff698a1be1b017c7dd9eebe36901f75fe65088b1\" returns successfully"
Feb 13 15:58:15.462478 containerd[1468]: time="2025-02-13T15:58:15.462378764Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\""
Feb 13 15:58:15.462641 containerd[1468]: time="2025-02-13T15:58:15.462563566Z" level=info msg="TearDown network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" successfully"
Feb 13 15:58:15.462641 containerd[1468]: time="2025-02-13T15:58:15.462582700Z" level=info msg="StopPodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" returns successfully"
Feb 13 15:58:15.464465 containerd[1468]: time="2025-02-13T15:58:15.463831530Z" level=info msg="RemovePodSandbox for \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\""
Feb 13 15:58:15.464465 containerd[1468]: time="2025-02-13T15:58:15.463892040Z" level=info msg="Forcibly stopping sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\""
Feb 13 15:58:15.464465 containerd[1468]: time="2025-02-13T15:58:15.464017689Z" level=info msg="TearDown network for sandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" successfully"
Feb 13 15:58:15.471114 containerd[1468]: time="2025-02-13T15:58:15.471044781Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.471477 containerd[1468]: time="2025-02-13T15:58:15.471439599Z" level=info msg="RemovePodSandbox \"3229f78b07e65d0dabd57562318d5ffb80819f17c136adca2ec9bf52d91c5266\" returns successfully"
Feb 13 15:58:15.473946 containerd[1468]: time="2025-02-13T15:58:15.473642093Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\""
Feb 13 15:58:15.474445 containerd[1468]: time="2025-02-13T15:58:15.474409252Z" level=info msg="TearDown network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" successfully"
Feb 13 15:58:15.474562 containerd[1468]: time="2025-02-13T15:58:15.474541296Z" level=info msg="StopPodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" returns successfully"
Feb 13 15:58:15.480640 containerd[1468]: time="2025-02-13T15:58:15.479953430Z" level=info msg="RemovePodSandbox for \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\""
Feb 13 15:58:15.481142 containerd[1468]: time="2025-02-13T15:58:15.480868934Z" level=info msg="Forcibly stopping sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\""
Feb 13 15:58:15.481142 containerd[1468]: time="2025-02-13T15:58:15.481051660Z" level=info msg="TearDown network for sandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" successfully"
Feb 13 15:58:15.488556 containerd[1468]: time="2025-02-13T15:58:15.488305593Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.489924 containerd[1468]: time="2025-02-13T15:58:15.489767153Z" level=info msg="RemovePodSandbox \"fbbdbbf57bc58dbbb7c36845a29afd775e19e13d886405906216628b94b549b2\" returns successfully"
Feb 13 15:58:15.492473 containerd[1468]: time="2025-02-13T15:58:15.491429077Z" level=info msg="StopPodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\""
Feb 13 15:58:15.492473 containerd[1468]: time="2025-02-13T15:58:15.491616060Z" level=info msg="TearDown network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" successfully"
Feb 13 15:58:15.492473 containerd[1468]: time="2025-02-13T15:58:15.491759867Z" level=info msg="StopPodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" returns successfully"
Feb 13 15:58:15.496976 containerd[1468]: time="2025-02-13T15:58:15.496907153Z" level=info msg="RemovePodSandbox for \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\""
Feb 13 15:58:15.497749 containerd[1468]: time="2025-02-13T15:58:15.497704604Z" level=info msg="Forcibly stopping sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\""
Feb 13 15:58:15.499101 containerd[1468]: time="2025-02-13T15:58:15.498656319Z" level=info msg="TearDown network for sandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" successfully"
Feb 13 15:58:15.532176 containerd[1468]: time="2025-02-13T15:58:15.531849098Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.532176 containerd[1468]: time="2025-02-13T15:58:15.531941689Z" level=info msg="RemovePodSandbox \"9ae3d6238bd3da5930782627160d94bd09457f850b479dc7693e4dd2460153c3\" returns successfully"
Feb 13 15:58:15.534165 containerd[1468]: time="2025-02-13T15:58:15.533580983Z" level=info msg="StopPodSandbox for \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\""
Feb 13 15:58:15.534165 containerd[1468]: time="2025-02-13T15:58:15.533745555Z" level=info msg="TearDown network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\" successfully"
Feb 13 15:58:15.534165 containerd[1468]: time="2025-02-13T15:58:15.533766519Z" level=info msg="StopPodSandbox for \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\" returns successfully"
Feb 13 15:58:15.535633 containerd[1468]: time="2025-02-13T15:58:15.534719518Z" level=info msg="RemovePodSandbox for \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\""
Feb 13 15:58:15.535633 containerd[1468]: time="2025-02-13T15:58:15.534768029Z" level=info msg="Forcibly stopping sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\""
Feb 13 15:58:15.535633 containerd[1468]: time="2025-02-13T15:58:15.534848134Z" level=info msg="TearDown network for sandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\" successfully"
Feb 13 15:58:15.542628 containerd[1468]: time="2025-02-13T15:58:15.542558001Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.542785 containerd[1468]: time="2025-02-13T15:58:15.542652940Z" level=info msg="RemovePodSandbox \"2ba8bde7d8757cc5ed5566bb4260fd6dcd103277fcda39c37ba14963a8283762\" returns successfully"
Feb 13 15:58:15.544002 containerd[1468]: time="2025-02-13T15:58:15.543959687Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\""
Feb 13 15:58:15.544347 containerd[1468]: time="2025-02-13T15:58:15.544284597Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully"
Feb 13 15:58:15.544632 containerd[1468]: time="2025-02-13T15:58:15.544611737Z" level=info msg="StopPodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully"
Feb 13 15:58:15.546323 containerd[1468]: time="2025-02-13T15:58:15.546280095Z" level=info msg="RemovePodSandbox for \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\""
Feb 13 15:58:15.547255 containerd[1468]: time="2025-02-13T15:58:15.546685234Z" level=info msg="Forcibly stopping sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\""
Feb 13 15:58:15.547255 containerd[1468]: time="2025-02-13T15:58:15.546837713Z" level=info msg="TearDown network for sandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" successfully"
Feb 13 15:58:15.566539 containerd[1468]: time="2025-02-13T15:58:15.566429689Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.566539 containerd[1468]: time="2025-02-13T15:58:15.566529544Z" level=info msg="RemovePodSandbox \"46f712329b69a435d3e2ba0857e532b92503ce04a3df3a889272fb534bfae610\" returns successfully"
Feb 13 15:58:15.569204 containerd[1468]: time="2025-02-13T15:58:15.569142242Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\""
Feb 13 15:58:15.572270 containerd[1468]: time="2025-02-13T15:58:15.569307536Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully"
Feb 13 15:58:15.572270 containerd[1468]: time="2025-02-13T15:58:15.569324460Z" level=info msg="StopPodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully"
Feb 13 15:58:15.574189 containerd[1468]: time="2025-02-13T15:58:15.573484392Z" level=info msg="RemovePodSandbox for \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\""
Feb 13 15:58:15.574189 containerd[1468]: time="2025-02-13T15:58:15.573544995Z" level=info msg="Forcibly stopping sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\""
Feb 13 15:58:15.574189 containerd[1468]: time="2025-02-13T15:58:15.573663630Z" level=info msg="TearDown network for sandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" successfully"
Feb 13 15:58:15.679865 containerd[1468]: time="2025-02-13T15:58:15.679808018Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.680189 containerd[1468]: time="2025-02-13T15:58:15.680085777Z" level=info msg="RemovePodSandbox \"63003d60d57aba964c29ad5e54dedbbfff20ee67e31265adb1545daedcc519b4\" returns successfully"
Feb 13 15:58:15.683206 containerd[1468]: time="2025-02-13T15:58:15.682821001Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\""
Feb 13 15:58:15.684250 containerd[1468]: time="2025-02-13T15:58:15.683887858Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully"
Feb 13 15:58:15.684250 containerd[1468]: time="2025-02-13T15:58:15.683944959Z" level=info msg="StopPodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully"
Feb 13 15:58:15.685007 containerd[1468]: time="2025-02-13T15:58:15.684875378Z" level=info msg="RemovePodSandbox for \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\""
Feb 13 15:58:15.685007 containerd[1468]: time="2025-02-13T15:58:15.684930576Z" level=info msg="Forcibly stopping sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\""
Feb 13 15:58:15.685248 containerd[1468]: time="2025-02-13T15:58:15.685096416Z" level=info msg="TearDown network for sandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" successfully"
Feb 13 15:58:15.697273 containerd[1468]: time="2025-02-13T15:58:15.696969332Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.697273 containerd[1468]: time="2025-02-13T15:58:15.697071212Z" level=info msg="RemovePodSandbox \"85965a1e832c60eff0f86d4ee3432dd2ea3e78d43d31411e9e6d07ce5254f646\" returns successfully"
Feb 13 15:58:15.699157 containerd[1468]: time="2025-02-13T15:58:15.698036754Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\""
Feb 13 15:58:15.699157 containerd[1468]: time="2025-02-13T15:58:15.698300770Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully"
Feb 13 15:58:15.699157 containerd[1468]: time="2025-02-13T15:58:15.698325459Z" level=info msg="StopPodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully"
Feb 13 15:58:15.701152 containerd[1468]: time="2025-02-13T15:58:15.700487130Z" level=info msg="RemovePodSandbox for \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\""
Feb 13 15:58:15.701152 containerd[1468]: time="2025-02-13T15:58:15.700550613Z" level=info msg="Forcibly stopping sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\""
Feb 13 15:58:15.701152 containerd[1468]: time="2025-02-13T15:58:15.700675268Z" level=info msg="TearDown network for sandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" successfully"
Feb 13 15:58:15.707002 containerd[1468]: time="2025-02-13T15:58:15.706925603Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.707243 containerd[1468]: time="2025-02-13T15:58:15.707023720Z" level=info msg="RemovePodSandbox \"e3f70af0344a9477ce1701728b02d8531b52e12c5d3ce6df0742909d97d51b74\" returns successfully"
Feb 13 15:58:15.708334 containerd[1468]: time="2025-02-13T15:58:15.708250799Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\""
Feb 13 15:58:15.708514 containerd[1468]: time="2025-02-13T15:58:15.708491156Z" level=info msg="TearDown network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully"
Feb 13 15:58:15.708563 containerd[1468]: time="2025-02-13T15:58:15.708536011Z" level=info msg="StopPodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully"
Feb 13 15:58:15.716176 containerd[1468]: time="2025-02-13T15:58:15.709781673Z" level=info msg="RemovePodSandbox for \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\""
Feb 13 15:58:15.716176 containerd[1468]: time="2025-02-13T15:58:15.709854696Z" level=info msg="Forcibly stopping sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\""
Feb 13 15:58:15.716176 containerd[1468]: time="2025-02-13T15:58:15.710038183Z" level=info msg="TearDown network for sandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" successfully"
Feb 13 15:58:15.718031 containerd[1468]: time="2025-02-13T15:58:15.717965385Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.718765 containerd[1468]: time="2025-02-13T15:58:15.718062870Z" level=info msg="RemovePodSandbox \"dee8c5f770800f247cb4bfcf37f3b0728048ae6da099d462d38f829c20c440b0\" returns successfully"
Feb 13 15:58:15.720195 containerd[1468]: time="2025-02-13T15:58:15.720105120Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\""
Feb 13 15:58:15.720646 containerd[1468]: time="2025-02-13T15:58:15.720540681Z" level=info msg="TearDown network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" successfully"
Feb 13 15:58:15.720719 containerd[1468]: time="2025-02-13T15:58:15.720647197Z" level=info msg="StopPodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" returns successfully"
Feb 13 15:58:15.721466 containerd[1468]: time="2025-02-13T15:58:15.721427184Z" level=info msg="RemovePodSandbox for \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\""
Feb 13 15:58:15.721550 containerd[1468]: time="2025-02-13T15:58:15.721471206Z" level=info msg="Forcibly stopping sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\""
Feb 13 15:58:15.721699 containerd[1468]: time="2025-02-13T15:58:15.721580655Z" level=info msg="TearDown network for sandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" successfully"
Feb 13 15:58:15.729108 containerd[1468]: time="2025-02-13T15:58:15.729035793Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.729632 containerd[1468]: time="2025-02-13T15:58:15.729230986Z" level=info msg="RemovePodSandbox \"5e6094d2f96af2dd8a6594751b3a422dccab8a2e7011bf0a21c81db29dd905cc\" returns successfully"
Feb 13 15:58:15.730583 containerd[1468]: time="2025-02-13T15:58:15.730302189Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\""
Feb 13 15:58:15.730583 containerd[1468]: time="2025-02-13T15:58:15.730456597Z" level=info msg="TearDown network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" successfully"
Feb 13 15:58:15.730583 containerd[1468]: time="2025-02-13T15:58:15.730474675Z" level=info msg="StopPodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" returns successfully"
Feb 13 15:58:15.732180 containerd[1468]: time="2025-02-13T15:58:15.732049112Z" level=info msg="RemovePodSandbox for \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\""
Feb 13 15:58:15.732180 containerd[1468]: time="2025-02-13T15:58:15.732095652Z" level=info msg="Forcibly stopping sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\""
Feb 13 15:58:15.734991 containerd[1468]: time="2025-02-13T15:58:15.734872815Z" level=info msg="TearDown network for sandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" successfully"
Feb 13 15:58:15.756145 containerd[1468]: time="2025-02-13T15:58:15.751560580Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.756145 containerd[1468]: time="2025-02-13T15:58:15.751662288Z" level=info msg="RemovePodSandbox \"694ad1103ff7f61bbf63b262da593b7e3f651ed7f36b87d675e7202b1c234a75\" returns successfully"
Feb 13 15:58:15.759300 containerd[1468]: time="2025-02-13T15:58:15.758593170Z" level=info msg="StopPodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\""
Feb 13 15:58:15.759300 containerd[1468]: time="2025-02-13T15:58:15.758768043Z" level=info msg="TearDown network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" successfully"
Feb 13 15:58:15.759300 containerd[1468]: time="2025-02-13T15:58:15.758786550Z" level=info msg="StopPodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" returns successfully"
Feb 13 15:58:15.760807 containerd[1468]: time="2025-02-13T15:58:15.760669558Z" level=info msg="RemovePodSandbox for \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\""
Feb 13 15:58:15.761155 containerd[1468]: time="2025-02-13T15:58:15.760775937Z" level=info msg="Forcibly stopping sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\""
Feb 13 15:58:15.761811 containerd[1468]: time="2025-02-13T15:58:15.761585255Z" level=info msg="TearDown network for sandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" successfully"
Feb 13 15:58:15.790188 containerd[1468]: time="2025-02-13T15:58:15.789595348Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:15.790763 containerd[1468]: time="2025-02-13T15:58:15.790540210Z" level=info msg="RemovePodSandbox \"467e14ffbaf0fe68dd3c9f644c5ae643187830d7077d38d418f8ea6d97f35eb6\" returns successfully"
Feb 13 15:58:16.034419 kubelet[1782]: E0213 15:58:16.033990 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:17.035528 kubelet[1782]: E0213 15:58:17.035428 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:18.036689 kubelet[1782]: E0213 15:58:18.036628 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:19.039108 kubelet[1782]: E0213 15:58:19.039039 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:19.291927 containerd[1468]: time="2025-02-13T15:58:19.291354797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:19.295311 containerd[1468]: time="2025-02-13T15:58:19.294888194Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=91039406"
Feb 13 15:58:19.297185 containerd[1468]: time="2025-02-13T15:58:19.296181387Z" level=info msg="ImageCreate event name:\"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:19.311473 containerd[1468]: time="2025-02-13T15:58:19.311391409Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:19.321623 containerd[1468]: time="2025-02-13T15:58:19.321533156Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\", repo tag \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest \"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"91036984\" in 8.518912215s"
Feb 13 15:58:19.322036 containerd[1468]: time="2025-02-13T15:58:19.321997863Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\""
Feb 13 15:58:19.376335 containerd[1468]: time="2025-02-13T15:58:19.368249747Z" level=info msg="CreateContainer within sandbox \"a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}"
Feb 13 15:58:19.413004 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1836796002.mount: Deactivated successfully.
Feb 13 15:58:19.414896 containerd[1468]: time="2025-02-13T15:58:19.413001361Z" level=info msg="CreateContainer within sandbox \"a0b101d042220474f400b95d30b0db4041b5dbc6f16c15bfb4d5db39916d5d59\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"b9d9c9eed117a8ba9e3b8cb7a975c68708bc3e6a83e0daabceff42017de6261c\""
Feb 13 15:58:19.414896 containerd[1468]: time="2025-02-13T15:58:19.414576481Z" level=info msg="StartContainer for \"b9d9c9eed117a8ba9e3b8cb7a975c68708bc3e6a83e0daabceff42017de6261c\""
Feb 13 15:58:19.488266 systemd[1]: Started cri-containerd-b9d9c9eed117a8ba9e3b8cb7a975c68708bc3e6a83e0daabceff42017de6261c.scope - libcontainer container b9d9c9eed117a8ba9e3b8cb7a975c68708bc3e6a83e0daabceff42017de6261c.
Feb 13 15:58:19.640212 containerd[1468]: time="2025-02-13T15:58:19.638196788Z" level=info msg="StartContainer for \"b9d9c9eed117a8ba9e3b8cb7a975c68708bc3e6a83e0daabceff42017de6261c\" returns successfully"
Feb 13 15:58:20.039557 kubelet[1782]: E0213 15:58:20.039389 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:21.040290 kubelet[1782]: E0213 15:58:21.040173 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:21.162947 kubelet[1782]: E0213 15:58:21.162241 1782 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
Feb 13 15:58:21.253082 kubelet[1782]: I0213 15:58:21.252813 1782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=3.729264525 podStartE2EDuration="12.25277956s" podCreationTimestamp="2025-02-13 15:58:09 +0000 UTC" firstStartedPulling="2025-02-13 15:58:10.801423482 +0000 UTC m=+56.536754273" lastFinishedPulling="2025-02-13 15:58:19.324938519 +0000 UTC m=+65.060269308" observedRunningTime="2025-02-13 15:58:20.110596388 +0000 UTC m=+65.845927184" watchObservedRunningTime="2025-02-13 15:58:21.25277956 +0000 UTC m=+66.988110361"
Feb 13 15:58:22.041044 kubelet[1782]: E0213 15:58:22.040961 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:23.042044 kubelet[1782]: E0213 15:58:23.041946 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:24.042922 kubelet[1782]: E0213 15:58:24.042807 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:25.052919 kubelet[1782]: E0213 15:58:25.043463 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:26.053583 kubelet[1782]: E0213 15:58:26.053487 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:27.054050 kubelet[1782]: E0213 15:58:27.053962 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:28.054828 kubelet[1782]: E0213 15:58:28.054723 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:29.055356 kubelet[1782]: E0213 15:58:29.055272 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:29.470006 kubelet[1782]: I0213 15:58:29.468569 1782 topology_manager.go:215] "Topology Admit Handler" podUID="606d0840-0695-49f2-81f7-601c16230f03" podNamespace="default" podName="test-pod-1"
Feb 13 15:58:29.483354 systemd[1]: Created slice kubepods-besteffort-pod606d0840_0695_49f2_81f7_601c16230f03.slice - libcontainer container kubepods-besteffort-pod606d0840_0695_49f2_81f7_601c16230f03.slice.
Feb 13 15:58:29.675478 kubelet[1782]: I0213 15:58:29.674983 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32d93398-3cfe-4106-9c5c-009695e97020\" (UniqueName: \"kubernetes.io/nfs/606d0840-0695-49f2-81f7-601c16230f03-pvc-32d93398-3cfe-4106-9c5c-009695e97020\") pod \"test-pod-1\" (UID: \"606d0840-0695-49f2-81f7-601c16230f03\") " pod="default/test-pod-1"
Feb 13 15:58:29.675478 kubelet[1782]: I0213 15:58:29.675074 1782 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8p5g\" (UniqueName: \"kubernetes.io/projected/606d0840-0695-49f2-81f7-601c16230f03-kube-api-access-w8p5g\") pod \"test-pod-1\" (UID: \"606d0840-0695-49f2-81f7-601c16230f03\") " pod="default/test-pod-1"
Feb 13 15:58:29.840319 kernel: FS-Cache: Loaded
Feb 13 15:58:29.952601 kernel: RPC: Registered named UNIX socket transport module.
Feb 13 15:58:29.952766 kernel: RPC: Registered udp transport module.
Feb 13 15:58:29.952798 kernel: RPC: Registered tcp transport module.
Feb 13 15:58:29.954734 kernel: RPC: Registered tcp-with-tls transport module.
Feb 13 15:58:29.954885 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 13 15:58:30.055851 kubelet[1782]: E0213 15:58:30.055799 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:30.341494 kernel: NFS: Registering the id_resolver key type
Feb 13 15:58:30.341704 kernel: Key type id_resolver registered
Feb 13 15:58:30.343323 kernel: Key type id_legacy registered
Feb 13 15:58:30.427072 nfsidmap[4409]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain '1.1-1-7a196a8365'
Feb 13 15:58:30.433189 nfsidmap[4412]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain '1.1-1-7a196a8365'
Feb 13 15:58:30.715585 containerd[1468]: time="2025-02-13T15:58:30.715328342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:606d0840-0695-49f2-81f7-601c16230f03,Namespace:default,Attempt:0,}"
Feb 13 15:58:31.057948 kubelet[1782]: E0213 15:58:31.057673 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:31.161980 systemd-networkd[1360]: cali5ec59c6bf6e: Link UP
Feb 13 15:58:31.163499 systemd-networkd[1360]: cali5ec59c6bf6e: Gained carrier
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:30.879 [INFO][4415] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {143.198.68.221-k8s-test--pod--1-eth0 default 606d0840-0695-49f2-81f7-601c16230f03 1645 0 2025-02-13 15:58:10 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 143.198.68.221 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:30.879 [INFO][4415] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.009 [INFO][4426] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" HandleID="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Workload="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.064 [INFO][4426] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" HandleID="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Workload="143.198.68.221-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00046f690), Attrs:map[string]string{"namespace":"default", "node":"143.198.68.221", "pod":"test-pod-1", "timestamp":"2025-02-13 15:58:31.00909965 +0000 UTC"}, Hostname:"143.198.68.221", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.065 [INFO][4426] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.065 [INFO][4426] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.065 [INFO][4426] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '143.198.68.221'
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.075 [INFO][4426] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.085 [INFO][4426] ipam/ipam.go 372: Looking up existing affinities for host host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.099 [INFO][4426] ipam/ipam.go 489: Trying affinity for 192.168.3.64/26 host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.105 [INFO][4426] ipam/ipam.go 155: Attempting to load block cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.115 [INFO][4426] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.115 [INFO][4426] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.121 [INFO][4426] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.137 [INFO][4426] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.151 [INFO][4426] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.3.68/26] block=192.168.3.64/26 handle="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.152 [INFO][4426] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.3.68/26] handle="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" host="143.198.68.221"
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.152 [INFO][4426] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 15:58:31.186788 containerd[1468]: 2025-02-13 15:58:31.152 [INFO][4426] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.68/26] IPv6=[] ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" HandleID="k8s-pod-network.6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Workload="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.188668 containerd[1468]: 2025-02-13 15:58:31.157 [INFO][4415] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"606d0840-0695-49f2-81f7-601c16230f03", ResourceVersion:"1645", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 58, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.3.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:31.188668 containerd[1468]: 2025-02-13 15:58:31.157 [INFO][4415] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.3.68/32] ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.188668 containerd[1468]: 2025-02-13 15:58:31.157 [INFO][4415] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.188668 containerd[1468]: 2025-02-13 15:58:31.162 [INFO][4415] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.188668 containerd[1468]: 2025-02-13 15:58:31.163 [INFO][4415] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"143.198.68.221-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"606d0840-0695-49f2-81f7-601c16230f03", ResourceVersion:"1645", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 58, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"143.198.68.221", ContainerID:"6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.3.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"56:af:35:ff:92:7e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:31.188668 containerd[1468]: 2025-02-13 15:58:31.181 [INFO][4415] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="143.198.68.221-k8s-test--pod--1-eth0"
Feb 13 15:58:31.245679 containerd[1468]: time="2025-02-13T15:58:31.245250652Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 15:58:31.245679 containerd[1468]: time="2025-02-13T15:58:31.245336350Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 15:58:31.245679 containerd[1468]: time="2025-02-13T15:58:31.245353041Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:58:31.245679 containerd[1468]: time="2025-02-13T15:58:31.245526424Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:58:31.287685 systemd[1]: Started cri-containerd-6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8.scope - libcontainer container 6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8.
Feb 13 15:58:31.360307 containerd[1468]: time="2025-02-13T15:58:31.360112147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:606d0840-0695-49f2-81f7-601c16230f03,Namespace:default,Attempt:0,} returns sandbox id \"6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8\""
Feb 13 15:58:31.369453 containerd[1468]: time="2025-02-13T15:58:31.368882079Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\""
Feb 13 15:58:32.028553 containerd[1468]: time="2025-02-13T15:58:32.027003199Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:32.028553 containerd[1468]: time="2025-02-13T15:58:32.028459486Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61"
Feb 13 15:58:32.033980 containerd[1468]: time="2025-02-13T15:58:32.033830917Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 664.873298ms"
Feb 13 15:58:32.033980 containerd[1468]: time="2025-02-13T15:58:32.033917860Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\""
Feb 13 15:58:32.038032 containerd[1468]: time="2025-02-13T15:58:32.037728128Z" level=info msg="CreateContainer within sandbox \"6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8\" for container &ContainerMetadata{Name:test,Attempt:0,}"
Feb 13 15:58:32.058994 kubelet[1782]: E0213 15:58:32.058928 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:32.074418 containerd[1468]: time="2025-02-13T15:58:32.073033419Z" level=info msg="CreateContainer within sandbox \"6d0b6e13e836249fa1fd03d6dd2f061f77b42f31c521a1257b8b8626878a10c8\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"d0df6c61c1f844897ff6840037a83569882a1d6853d4947c97837b20de78477b\""
Feb 13 15:58:32.074706 containerd[1468]: time="2025-02-13T15:58:32.074642767Z" level=info msg="StartContainer for \"d0df6c61c1f844897ff6840037a83569882a1d6853d4947c97837b20de78477b\""
Feb 13 15:58:32.153484 systemd[1]: Started cri-containerd-d0df6c61c1f844897ff6840037a83569882a1d6853d4947c97837b20de78477b.scope - libcontainer container d0df6c61c1f844897ff6840037a83569882a1d6853d4947c97837b20de78477b.
Feb 13 15:58:32.208096 containerd[1468]: time="2025-02-13T15:58:32.207890067Z" level=info msg="StartContainer for \"d0df6c61c1f844897ff6840037a83569882a1d6853d4947c97837b20de78477b\" returns successfully"
Feb 13 15:58:32.417501 systemd-networkd[1360]: cali5ec59c6bf6e: Gained IPv6LL
Feb 13 15:58:33.059701 kubelet[1782]: E0213 15:58:33.059591 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:34.059922 kubelet[1782]: E0213 15:58:34.059791 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:34.890636 kubelet[1782]: E0213 15:58:34.890421 1782 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:35.060434 kubelet[1782]: E0213 15:58:35.060351 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:36.061810 kubelet[1782]: E0213 15:58:36.061179 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 15:58:37.062497 kubelet[1782]: E0213 15:58:37.062364 1782 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"