Jan 13 21:48:40.058842 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 18:58:40 -00 2025
Jan 13 21:48:40.058894 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 21:48:40.058910 kernel: BIOS-provided physical RAM map:
Jan 13 21:48:40.058938 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 13 21:48:40.058949 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 13 21:48:40.058959 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 13 21:48:40.058971 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Jan 13 21:48:40.058982 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Jan 13 21:48:40.058993 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 13 21:48:40.059016 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 13 21:48:40.059045 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 13 21:48:40.059057 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 13 21:48:40.059085 kernel: NX (Execute Disable) protection: active
Jan 13 21:48:40.059102 kernel: APIC: Static calls initialized
Jan 13 21:48:40.059120 kernel: SMBIOS 2.8 present.
Jan 13 21:48:40.059134 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Jan 13 21:48:40.059146 kernel: Hypervisor detected: KVM
Jan 13 21:48:40.059163 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 13 21:48:40.059176 kernel: kvm-clock: using sched offset of 4424749051 cycles
Jan 13 21:48:40.059189 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 13 21:48:40.059202 kernel: tsc: Detected 2499.998 MHz processor
Jan 13 21:48:40.059214 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 21:48:40.059227 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 21:48:40.059239 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Jan 13 21:48:40.059251 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 13 21:48:40.059264 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 21:48:40.059280 kernel: Using GB pages for direct mapping
Jan 13 21:48:40.059293 kernel: ACPI: Early table checksum verification disabled
Jan 13 21:48:40.059305 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 13 21:48:40.059317 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059329 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059375 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059387 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Jan 13 21:48:40.059399 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059411 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059441 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059454 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 13 21:48:40.059466 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Jan 13 21:48:40.059485 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Jan 13 21:48:40.059498 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Jan 13 21:48:40.059516 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Jan 13 21:48:40.059529 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Jan 13 21:48:40.059548 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Jan 13 21:48:40.059561 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Jan 13 21:48:40.059574 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 21:48:40.059587 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Jan 13 21:48:40.059600 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 21:48:40.059612 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Jan 13 21:48:40.059625 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 21:48:40.059637 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Jan 13 21:48:40.059654 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 21:48:40.059667 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Jan 13 21:48:40.059680 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 21:48:40.059692 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Jan 13 21:48:40.059705 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 21:48:40.059717 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Jan 13 21:48:40.059730 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 21:48:40.059742 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Jan 13 21:48:40.059755 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 21:48:40.059774 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Jan 13 21:48:40.059787 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 21:48:40.059800 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 21:48:40.059825 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Jan 13 21:48:40.059838 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Jan 13 21:48:40.059850 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Jan 13 21:48:40.059862 kernel: Zone ranges:
Jan 13 21:48:40.059875 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 21:48:40.059887 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Jan 13 21:48:40.059899 kernel: Normal empty
Jan 13 21:48:40.059927 kernel: Movable zone start for each node
Jan 13 21:48:40.059939 kernel: Early memory node ranges
Jan 13 21:48:40.059951 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jan 13 21:48:40.059963 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Jan 13 21:48:40.059975 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Jan 13 21:48:40.059986 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 21:48:40.059998 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 13 21:48:40.060031 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Jan 13 21:48:40.060046 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 13 21:48:40.060064 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 13 21:48:40.060077 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 13 21:48:40.060090 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 13 21:48:40.060103 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 13 21:48:40.060116 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 13 21:48:40.060128 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 13 21:48:40.060141 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 13 21:48:40.060154 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 13 21:48:40.060167 kernel: TSC deadline timer available
Jan 13 21:48:40.060184 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Jan 13 21:48:40.060196 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 13 21:48:40.060209 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 13 21:48:40.060222 kernel: Booting paravirtualized kernel on KVM
Jan 13 21:48:40.060235 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 13 21:48:40.060248 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Jan 13 21:48:40.060260 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 13 21:48:40.060273 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 13 21:48:40.060286 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Jan 13 21:48:40.060303 kernel: kvm-guest: PV spinlocks enabled
Jan 13 21:48:40.060328 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 13 21:48:40.061259 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 21:48:40.061275 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 13 21:48:40.061288 kernel: random: crng init done
Jan 13 21:48:40.061313 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 13 21:48:40.061326 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 13 21:48:40.061358 kernel: Fallback order for Node 0: 0
Jan 13 21:48:40.061390 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Jan 13 21:48:40.061407 kernel: Policy zone: DMA32
Jan 13 21:48:40.061425 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 21:48:40.064422 kernel: software IO TLB: area num 16.
Jan 13 21:48:40.064439 kernel: Memory: 1899484K/2096616K available (14336K kernel code, 2299K rwdata, 22800K rodata, 43320K init, 1756K bss, 196872K reserved, 0K cma-reserved)
Jan 13 21:48:40.064452 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Jan 13 21:48:40.064464 kernel: Kernel/User page tables isolation: enabled
Jan 13 21:48:40.064485 kernel: ftrace: allocating 37890 entries in 149 pages
Jan 13 21:48:40.064497 kernel: ftrace: allocated 149 pages with 4 groups
Jan 13 21:48:40.064517 kernel: Dynamic Preempt: voluntary
Jan 13 21:48:40.064530 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 21:48:40.064559 kernel: rcu: RCU event tracing is enabled.
Jan 13 21:48:40.064572 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Jan 13 21:48:40.064586 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 21:48:40.064631 kernel: Rude variant of Tasks RCU enabled.
Jan 13 21:48:40.064649 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 21:48:40.064662 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 21:48:40.064676 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Jan 13 21:48:40.064689 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Jan 13 21:48:40.064703 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 13 21:48:40.064718 kernel: Console: colour VGA+ 80x25
Jan 13 21:48:40.064735 kernel: printk: console [tty0] enabled
Jan 13 21:48:40.064749 kernel: printk: console [ttyS0] enabled
Jan 13 21:48:40.064762 kernel: ACPI: Core revision 20230628
Jan 13 21:48:40.064776 kernel: APIC: Switch to symmetric I/O mode setup
Jan 13 21:48:40.064789 kernel: x2apic enabled
Jan 13 21:48:40.064807 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 13 21:48:40.064820 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 13 21:48:40.064834 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Jan 13 21:48:40.064847 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 13 21:48:40.064861 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 13 21:48:40.064874 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 13 21:48:40.064887 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 13 21:48:40.064900 kernel: Spectre V2 : Mitigation: Retpolines
Jan 13 21:48:40.064914 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 13 21:48:40.064946 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Jan 13 21:48:40.064963 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 13 21:48:40.064976 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 13 21:48:40.065001 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 13 21:48:40.065013 kernel: MDS: Mitigation: Clear CPU buffers
Jan 13 21:48:40.065034 kernel: MMIO Stale Data: Unknown: No mitigations
Jan 13 21:48:40.065047 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 13 21:48:40.065073 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 13 21:48:40.065087 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 13 21:48:40.065100 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 13 21:48:40.065113 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 13 21:48:40.065132 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Jan 13 21:48:40.065146 kernel: Freeing SMP alternatives memory: 32K
Jan 13 21:48:40.065159 kernel: pid_max: default: 32768 minimum: 301
Jan 13 21:48:40.065172 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 13 21:48:40.065185 kernel: landlock: Up and running.
Jan 13 21:48:40.065198 kernel: SELinux: Initializing.
Jan 13 21:48:40.065211 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 21:48:40.065224 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 21:48:40.065238 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Jan 13 21:48:40.065251 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 13 21:48:40.065264 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 13 21:48:40.065282 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 13 21:48:40.065296 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Jan 13 21:48:40.065309 kernel: signal: max sigframe size: 1776
Jan 13 21:48:40.065323 kernel: rcu: Hierarchical SRCU implementation.
Jan 13 21:48:40.065354 kernel: rcu: Max phase no-delay instances is 400.
Jan 13 21:48:40.065382 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jan 13 21:48:40.065395 kernel: smp: Bringing up secondary CPUs ...
Jan 13 21:48:40.065408 kernel: smpboot: x86: Booting SMP configuration:
Jan 13 21:48:40.065420 kernel: .... node #0, CPUs: #1
Jan 13 21:48:40.065438 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Jan 13 21:48:40.065451 kernel: smp: Brought up 1 node, 2 CPUs
Jan 13 21:48:40.065464 kernel: smpboot: Max logical packages: 16
Jan 13 21:48:40.065488 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Jan 13 21:48:40.065500 kernel: devtmpfs: initialized
Jan 13 21:48:40.065512 kernel: x86/mm: Memory block size: 128MB
Jan 13 21:48:40.065524 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 13 21:48:40.065548 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Jan 13 21:48:40.065561 kernel: pinctrl core: initialized pinctrl subsystem
Jan 13 21:48:40.065577 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 13 21:48:40.065590 kernel: audit: initializing netlink subsys (disabled)
Jan 13 21:48:40.065603 kernel: audit: type=2000 audit(1736804918.219:1): state=initialized audit_enabled=0 res=1
Jan 13 21:48:40.065615 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 13 21:48:40.065647 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 13 21:48:40.065659 kernel: cpuidle: using governor menu
Jan 13 21:48:40.065672 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 13 21:48:40.065698 kernel: dca service started, version 1.12.1
Jan 13 21:48:40.065712 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Jan 13 21:48:40.065730 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 13 21:48:40.065744 kernel: PCI: Using configuration type 1 for base access
Jan 13 21:48:40.065757 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 13 21:48:40.065770 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 13 21:48:40.065784 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 13 21:48:40.065797 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 13 21:48:40.065810 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 13 21:48:40.065823 kernel: ACPI: Added _OSI(Module Device)
Jan 13 21:48:40.065836 kernel: ACPI: Added _OSI(Processor Device)
Jan 13 21:48:40.065854 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 13 21:48:40.065867 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 13 21:48:40.065881 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 13 21:48:40.065894 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Jan 13 21:48:40.065908 kernel: ACPI: Interpreter enabled
Jan 13 21:48:40.065921 kernel: ACPI: PM: (supports S0 S5)
Jan 13 21:48:40.065934 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 13 21:48:40.065948 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 13 21:48:40.065961 kernel: PCI: Using E820 reservations for host bridge windows
Jan 13 21:48:40.065979 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 13 21:48:40.065992 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 13 21:48:40.066262 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 13 21:48:40.066563 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 13 21:48:40.066750 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 13 21:48:40.066771 kernel: PCI host bridge to bus 0000:00
Jan 13 21:48:40.066960 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 13 21:48:40.067155 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 13 21:48:40.067313 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 13 21:48:40.067534 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 13 21:48:40.067694 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 13 21:48:40.067868 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Jan 13 21:48:40.068038 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 13 21:48:40.068241 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Jan 13 21:48:40.071872 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Jan 13 21:48:40.072069 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Jan 13 21:48:40.072246 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Jan 13 21:48:40.072499 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Jan 13 21:48:40.072686 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 13 21:48:40.072872 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.073090 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Jan 13 21:48:40.073272 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.073530 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Jan 13 21:48:40.073732 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.073923 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Jan 13 21:48:40.074126 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.074308 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Jan 13 21:48:40.074520 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.074695 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Jan 13 21:48:40.074876 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.075073 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Jan 13 21:48:40.075256 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.078147 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Jan 13 21:48:40.078359 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Jan 13 21:48:40.078556 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Jan 13 21:48:40.078739 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Jan 13 21:48:40.078925 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Jan 13 21:48:40.079135 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Jan 13 21:48:40.079340 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Jan 13 21:48:40.079536 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Jan 13 21:48:40.079728 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Jan 13 21:48:40.079911 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Jan 13 21:48:40.080108 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Jan 13 21:48:40.080283 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Jan 13 21:48:40.082012 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Jan 13 21:48:40.082209 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 13 21:48:40.082445 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Jan 13 21:48:40.082650 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Jan 13 21:48:40.082829 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Jan 13 21:48:40.083016 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Jan 13 21:48:40.083205 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Jan 13 21:48:40.083419 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Jan 13 21:48:40.083630 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Jan 13 21:48:40.083812 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 13 21:48:40.083985 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 13 21:48:40.084183 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 13 21:48:40.084456 kernel: pci_bus 0000:02: extended config space not accessible
Jan 13 21:48:40.084676 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Jan 13 21:48:40.084868 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Jan 13 21:48:40.085059 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 13 21:48:40.085235 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 13 21:48:40.085438 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Jan 13 21:48:40.085615 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Jan 13 21:48:40.085787 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 13 21:48:40.085958 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 13 21:48:40.086145 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 13 21:48:40.088426 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Jan 13 21:48:40.088647 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Jan 13 21:48:40.088840 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 13 21:48:40.089012 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 13 21:48:40.089200 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 13 21:48:40.089402 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 13 21:48:40.089581 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 13 21:48:40.089760 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 13 21:48:40.089962 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 13 21:48:40.090146 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 13 21:48:40.090317 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 13 21:48:40.090584 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 13 21:48:40.090786 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 13 21:48:40.090954 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 13 21:48:40.091152 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 13 21:48:40.091348 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 13 21:48:40.091524 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 13 21:48:40.091699 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 13 21:48:40.091882 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 13 21:48:40.092073 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 13 21:48:40.092094 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 13 21:48:40.092108 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 13 21:48:40.092122 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 13 21:48:40.092143 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 13 21:48:40.092157 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 13 21:48:40.092170 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 13 21:48:40.092184 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 13 21:48:40.092197 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 13 21:48:40.092211 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 13 21:48:40.092225 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 13 21:48:40.092238 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 13 21:48:40.092251 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 13 21:48:40.092270 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 13 21:48:40.092284 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 13 21:48:40.092297 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 13 21:48:40.092318 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 13 21:48:40.092347 kernel: iommu: Default domain type: Translated
Jan 13 21:48:40.092361 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 13 21:48:40.092374 kernel: PCI: Using ACPI for IRQ routing
Jan 13 21:48:40.092388 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 13 21:48:40.092402 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 13 21:48:40.092421 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Jan 13 21:48:40.092612 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 13 21:48:40.092792 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 13 21:48:40.092964 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 13 21:48:40.092985 kernel: vgaarb: loaded
Jan 13 21:48:40.092999 kernel: clocksource: Switched to clocksource kvm-clock
Jan 13 21:48:40.093013 kernel: VFS: Disk quotas dquot_6.6.0
Jan 13 21:48:40.093039 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 13 21:48:40.093053 kernel: pnp: PnP ACPI init
Jan 13 21:48:40.093244 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 13 21:48:40.093266 kernel: pnp: PnP ACPI: found 5 devices
Jan 13 21:48:40.093281 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 13 21:48:40.093294 kernel: NET: Registered PF_INET protocol family
Jan 13 21:48:40.093308 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 21:48:40.093335 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 13 21:48:40.093351 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 13 21:48:40.093365 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 13 21:48:40.093386 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 13 21:48:40.093400 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 13 21:48:40.093413 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 21:48:40.093427 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 21:48:40.093440 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 13 21:48:40.093454 kernel: NET: Registered PF_XDP protocol family
Jan 13 21:48:40.093626 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Jan 13 21:48:40.093822 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 13 21:48:40.094006 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 13 21:48:40.094206 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 13 21:48:40.094407 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 13 21:48:40.094577 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 13 21:48:40.094744 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 13 21:48:40.094930 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 13 21:48:40.095143 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Jan 13 21:48:40.095313 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Jan 13 21:48:40.095544 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Jan 13 21:48:40.095717 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Jan 13 21:48:40.095907 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Jan 13 21:48:40.096106 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Jan 13 21:48:40.096281 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Jan 13 21:48:40.096484 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Jan 13 21:48:40.096705 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 13 21:48:40.096907 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 13 21:48:40.097125 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 13 21:48:40.097300 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jan 13 21:48:40.097496 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 13 21:48:40.097667 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 13 21:48:40.097835 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 13 21:48:40.098003 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jan 13 21:48:40.098194 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 13 21:48:40.098392 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 13 21:48:40.098567 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 13 21:48:40.098740 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jan 13 21:48:40.098917 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 13 21:48:40.099118 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 13 21:48:40.099301 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 13 21:48:40.099589 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jan 13 21:48:40.099758 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 13 21:48:40.099927 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 13 21:48:40.100112 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 13 21:48:40.100282 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jan 13 21:48:40.100479 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 13 21:48:40.100648 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 13 21:48:40.100817 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 13 21:48:40.100996 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jan 13 21:48:40.101178 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 13 21:48:40.101371 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 13 21:48:40.101542 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 13 21:48:40.101710 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jan 13 21:48:40.101891 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 13 21:48:40.102076 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 13 21:48:40.102248 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 13 21:48:40.102440 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jan 13 21:48:40.102636 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 13 21:48:40.102807 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 13 21:48:40.102969 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jan 13 21:48:40.103140 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jan 13 21:48:40.103297 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 13 21:48:40.103497 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 13 21:48:40.103664 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 13 21:48:40.103819 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Jan 13 21:48:40.103993 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jan 13 21:48:40.104173 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Jan 13 21:48:40.104389 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 13 21:48:40.104565 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 13 21:48:40.104746 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Jan 13 21:48:40.104907 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 13 21:48:40.105083 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 13 21:48:40.105255 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Jan 13 21:48:40.105448 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 13 21:48:40.105648 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 13 21:48:40.105833 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Jan 13 21:48:40.105996 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 13 21:48:40.106174 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 13 21:48:40.106372 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Jan 13 21:48:40.106538 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 13 21:48:40.106731 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 13 21:48:40.106911 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Jan 13 21:48:40.107100 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 13 21:48:40.107264 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 13 21:48:40.107522 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Jan 13 21:48:40.107722 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 13 21:48:40.107894 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 13 21:48:40.108090 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Jan 13 21:48:40.108254 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 13 21:48:40.108455 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 13 21:48:40.108477 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 13 21:48:40.108492 kernel: PCI: CLS 0 bytes, default 64
Jan 13 21:48:40.108506 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 13 21:48:40.108532 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
Jan 13 21:48:40.108554 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 13 21:48:40.108569 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 13 21:48:40.108596 kernel: Initialise system trusted keyrings
Jan 13 21:48:40.108621 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 13 21:48:40.108635 kernel: Key type asymmetric registered
Jan 13 21:48:40.108649 kernel: Asymmetric key parser 'x509' registered
Jan 13 21:48:40.108662 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 13 21:48:40.108676 kernel: io scheduler mq-deadline registered
Jan 13 21:48:40.108701 kernel: io scheduler kyber registered
Jan 13 21:48:40.108715 kernel: io scheduler bfq registered
Jan 13 21:48:40.108886 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 13 21:48:40.109079 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 13 21:48:40.109263 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.109513 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 13 21:48:40.109739 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 13 21:48:40.109911 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.110098 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 13 21:48:40.110269 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 13 21:48:40.110468 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.110639 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 13 21:48:40.110810 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 13 21:48:40.110990 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.111184 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 13 21:48:40.111380 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 13 21:48:40.111563 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.111742 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 13 21:48:40.111913 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 13 21:48:40.112101 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.112276 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 13 21:48:40.112489 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 13 21:48:40.112682 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.112886 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 13 21:48:40.113075 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 13 21:48:40.113248 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 13 21:48:40.113270 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jan 13 21:48:40.113285 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 13 21:48:40.113307 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 13 21:48:40.113335 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 13 21:48:40.113364 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 13 21:48:40.113378 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 13 21:48:40.113391 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 13 21:48:40.113418 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 13 21:48:40.113432 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jan 13 21:48:40.113609 kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 13 21:48:40.113774 kernel: rtc_cmos 00:03: registered as rtc0
Jan 13 21:48:40.113959 kernel: rtc_cmos 00:03: setting system clock to 2025-01-13T21:48:39 UTC (1736804919)
Jan 13 21:48:40.114134 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 13 21:48:40.114156 kernel: intel_pstate: CPU model not supported
Jan 13 21:48:40.114170 kernel: NET: Registered PF_INET6 protocol family
Jan 13 21:48:40.114185 kernel: Segment Routing with IPv6
Jan 13 21:48:40.114199 kernel: In-situ OAM (IOAM) with IPv6
Jan 13 21:48:40.114213 kernel: NET: Registered PF_PACKET protocol family
Jan 13 21:48:40.114233 kernel: Key type dns_resolver registered
Jan 13 21:48:40.114252 kernel: IPI shorthand broadcast: enabled
Jan 13 21:48:40.114266 kernel: sched_clock: Marking stable (1295003583, 240975620)->(1667301681, -131322478)
Jan 13 21:48:40.114280 kernel: registered taskstats version 1
Jan 13 21:48:40.114294 kernel: Loading compiled-in X.509 certificates
Jan 13 21:48:40.114309 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: ede78b3e719729f95eaaf7cb6a5289b567f6ee3e'
Jan 13 21:48:40.114365 kernel: Key type .fscrypt registered
Jan 13 21:48:40.114383 kernel: Key type fscrypt-provisioning registered
Jan 13 21:48:40.114398 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 13 21:48:40.114412 kernel: ima: Allocated hash algorithm: sha1
Jan 13 21:48:40.114433 kernel: ima: No architecture policies found
Jan 13 21:48:40.114447 kernel: clk: Disabling unused clocks
Jan 13 21:48:40.114461 kernel: Freeing unused kernel image (initmem) memory: 43320K
Jan 13 21:48:40.114475 kernel: Write protecting the kernel read-only data: 38912k
Jan 13 21:48:40.114490 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K
Jan 13 21:48:40.114504 kernel: Run /init as init process
Jan 13 21:48:40.114518 kernel: with arguments:
Jan 13 21:48:40.114532 kernel: /init
Jan 13 21:48:40.114546 kernel: with environment:
Jan 13 21:48:40.114565 kernel: HOME=/
Jan 13 21:48:40.114579 kernel: TERM=linux
Jan 13 21:48:40.114593 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 13 21:48:40.114619 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 13 21:48:40.114638 systemd[1]: Detected virtualization kvm.
Jan 13 21:48:40.114653 systemd[1]: Detected architecture x86-64.
Jan 13 21:48:40.114668 systemd[1]: Running in initrd.
Jan 13 21:48:40.114683 systemd[1]: No hostname configured, using default hostname.
Jan 13 21:48:40.114704 systemd[1]: Hostname set to .
Jan 13 21:48:40.114719 systemd[1]: Initializing machine ID from VM UUID.
Jan 13 21:48:40.114742 systemd[1]: Queued start job for default target initrd.target.
Jan 13 21:48:40.114757 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 21:48:40.114773 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 21:48:40.114796 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 13 21:48:40.114812 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 13 21:48:40.114827 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 13 21:48:40.114847 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 13 21:48:40.114865 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 13 21:48:40.114880 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 13 21:48:40.114895 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 21:48:40.114911 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 13 21:48:40.114925 systemd[1]: Reached target paths.target - Path Units.
Jan 13 21:48:40.114945 systemd[1]: Reached target slices.target - Slice Units.
Jan 13 21:48:40.114960 systemd[1]: Reached target swap.target - Swaps.
Jan 13 21:48:40.114975 systemd[1]: Reached target timers.target - Timer Units.
Jan 13 21:48:40.114990 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 21:48:40.115005 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 21:48:40.115020 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 13 21:48:40.115047 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 13 21:48:40.115062 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 21:48:40.115078 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 13 21:48:40.115099 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 21:48:40.115114 systemd[1]: Reached target sockets.target - Socket Units.
Jan 13 21:48:40.115129 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 13 21:48:40.115144 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 13 21:48:40.115159 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 13 21:48:40.115174 systemd[1]: Starting systemd-fsck-usr.service...
Jan 13 21:48:40.115189 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 13 21:48:40.115204 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 13 21:48:40.115263 systemd-journald[202]: Collecting audit messages is disabled.
Jan 13 21:48:40.115303 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 21:48:40.115332 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 13 21:48:40.115362 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 21:48:40.115377 systemd[1]: Finished systemd-fsck-usr.service.
Jan 13 21:48:40.115419 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 13 21:48:40.115441 systemd-journald[202]: Journal started
Jan 13 21:48:40.115478 systemd-journald[202]: Runtime Journal (/run/log/journal/dac6e42b76cb4c8fbb4239e91d6757a3) is 4.7M, max 37.9M, 33.2M free.
Jan 13 21:48:40.059220 systemd-modules-load[203]: Inserted module 'overlay'
Jan 13 21:48:40.185950 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 13 21:48:40.185987 kernel: Bridge firewalling registered
Jan 13 21:48:40.125965 systemd-modules-load[203]: Inserted module 'br_netfilter'
Jan 13 21:48:40.200344 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 21:48:40.200918 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 13 21:48:40.203101 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 21:48:40.205219 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 21:48:40.215703 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 21:48:40.218383 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 13 21:48:40.222557 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 21:48:40.235092 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 21:48:40.239636 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 21:48:40.249755 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 21:48:40.254691 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 21:48:40.256854 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 21:48:40.263584 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 13 21:48:40.266518 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 21:48:40.283229 dracut-cmdline[236]: dracut-dracut-053
Jan 13 21:48:40.288003 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8a11404d893165624d9716a125d997be53e2d6cdb0c50a945acda5b62a14eda5
Jan 13 21:48:40.318162 systemd-resolved[237]: Positive Trust Anchors:
Jan 13 21:48:40.318192 systemd-resolved[237]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 21:48:40.318238 systemd-resolved[237]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 21:48:40.327620 systemd-resolved[237]: Defaulting to hostname 'linux'.
Jan 13 21:48:40.330922 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 21:48:40.331807 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 21:48:40.394386 kernel: SCSI subsystem initialized
Jan 13 21:48:40.406344 kernel: Loading iSCSI transport class v2.0-870.
Jan 13 21:48:40.420670 kernel: iscsi: registered transport (tcp)
Jan 13 21:48:40.446994 kernel: iscsi: registered transport (qla4xxx)
Jan 13 21:48:40.447078 kernel: QLogic iSCSI HBA Driver
Jan 13 21:48:40.506522 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 13 21:48:40.514551 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 13 21:48:40.545891 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 13 21:48:40.545979 kernel: device-mapper: uevent: version 1.0.3
Jan 13 21:48:40.546765 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 13 21:48:40.599437 kernel: raid6: sse2x4 gen() 12541 MB/s
Jan 13 21:48:40.617378 kernel: raid6: sse2x2 gen() 8973 MB/s
Jan 13 21:48:40.636078 kernel: raid6: sse2x1 gen() 9190 MB/s
Jan 13 21:48:40.636119 kernel: raid6: using algorithm sse2x4 gen() 12541 MB/s
Jan 13 21:48:40.655219 kernel: raid6: .... xor() 7552 MB/s, rmw enabled
Jan 13 21:48:40.655316 kernel: raid6: using ssse3x2 recovery algorithm
Jan 13 21:48:40.682366 kernel: xor: automatically using best checksumming function avx
Jan 13 21:48:40.858369 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 13 21:48:40.875014 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 21:48:40.888640 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 21:48:40.911773 systemd-udevd[420]: Using default interface naming scheme 'v255'.
Jan 13 21:48:40.919872 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 21:48:40.927729 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 13 21:48:40.950347 dracut-pre-trigger[424]: rd.md=0: removing MD RAID activation
Jan 13 21:48:40.992135 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 21:48:40.997639 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 21:48:41.111829 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 21:48:41.124621 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 13 21:48:41.153736 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 13 21:48:41.155818 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 21:48:41.157738 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 21:48:41.159662 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 21:48:41.166515 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 13 21:48:41.199915 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 21:48:41.252387 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Jan 13 21:48:41.310526 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Jan 13 21:48:41.310761 kernel: ACPI: bus type USB registered
Jan 13 21:48:41.310781 kernel: usbcore: registered new interface driver usbfs
Jan 13 21:48:41.310820 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 13 21:48:41.310839 kernel: GPT:17805311 != 125829119
Jan 13 21:48:41.310868 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 13 21:48:41.310884 kernel: GPT:17805311 != 125829119
Jan 13 21:48:41.310909 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 13 21:48:41.310926 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 13 21:48:41.310943 kernel: usbcore: registered new interface driver hub
Jan 13 21:48:41.310960 kernel: usbcore: registered new device driver usb
Jan 13 21:48:41.311010 kernel: cryptd: max_cpu_qlen set to 1000
Jan 13 21:48:41.320483 kernel: AVX version of gcm_enc/dec engaged.
Jan 13 21:48:41.326988 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 21:48:41.335984 kernel: AES CTR mode by8 optimization enabled
Jan 13 21:48:41.327175 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 21:48:41.332420 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 13 21:48:41.336853 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 21:48:41.337049 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 21:48:41.339738 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 21:48:41.354485 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 21:48:41.398257 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Jan 13 21:48:41.398572 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1
Jan 13 21:48:41.398795 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Jan 13 21:48:41.399016 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Jan 13 21:48:41.399228 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2
Jan 13 21:48:41.399769 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed
Jan 13 21:48:41.399982 kernel: hub 1-0:1.0: USB hub found
Jan 13 21:48:41.400586 kernel: hub 1-0:1.0: 4 ports detected
Jan 13 21:48:41.400845 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Jan 13 21:48:41.401152 kernel: hub 2-0:1.0: USB hub found
Jan 13 21:48:41.401392 kernel: libata version 3.00 loaded.
Jan 13 21:48:41.401435 kernel: hub 2-0:1.0: 4 ports detected
Jan 13 21:48:41.418465 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Jan 13 21:48:41.436076 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (480)
Jan 13 21:48:41.439379 kernel: BTRFS: device fsid 7f507843-6957-466b-8fb7-5bee228b170a devid 1 transid 44 /dev/vda3 scanned by (udev-worker) (471)
Jan 13 21:48:41.448344 kernel: ahci 0000:00:1f.2: version 3.0
Jan 13 21:48:41.471757 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 13 21:48:41.471803 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Jan 13 21:48:41.472057 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Jan 13 21:48:41.472267 kernel: scsi host0: ahci
Jan 13 21:48:41.472515 kernel: scsi host1: ahci
Jan 13 21:48:41.472729 kernel: scsi host2: ahci
Jan 13 21:48:41.472926 kernel: scsi host3: ahci
Jan 13 21:48:41.473137 kernel: scsi host4: ahci
Jan 13 21:48:41.473373 kernel: scsi host5: ahci
Jan 13 21:48:41.473576 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41
Jan 13 21:48:41.473609 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41
Jan 13 21:48:41.473627 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41
Jan 13 21:48:41.473644 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41
Jan 13 21:48:41.473662 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41
Jan 13 21:48:41.473680 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41
Jan 13 21:48:41.455767 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Jan 13 21:48:41.551578 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 21:48:41.563975 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jan 13 21:48:41.564853 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jan 13 21:48:41.573161 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 13 21:48:41.580562 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 13 21:48:41.585514 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:48:41.588131 disk-uuid[566]: Primary Header is updated. Jan 13 21:48:41.588131 disk-uuid[566]: Secondary Entries is updated. Jan 13 21:48:41.588131 disk-uuid[566]: Secondary Header is updated. Jan 13 21:48:41.595364 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 13 21:48:41.604403 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 13 21:48:41.630398 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:48:41.750355 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 13 21:48:41.784923 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 13 21:48:41.784993 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 13 21:48:41.787743 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 13 21:48:41.788421 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 13 21:48:41.791353 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 13 21:48:41.793354 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 13 21:48:41.806261 kernel: usbcore: registered new interface driver usbhid Jan 13 21:48:41.806301 kernel: usbhid: USB HID core driver Jan 13 21:48:41.814124 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 13 21:48:41.814164 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Jan 13 21:48:42.610392 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 13 21:48:42.612455 disk-uuid[567]: The operation has completed successfully. Jan 13 21:48:42.666686 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 21:48:42.666876 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 21:48:42.691545 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 13 21:48:42.698563 sh[586]: Success Jan 13 21:48:42.716511 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Jan 13 21:48:42.782868 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 21:48:42.792492 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 21:48:42.795027 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 21:48:42.824739 kernel: BTRFS info (device dm-0): first mount of filesystem 7f507843-6957-466b-8fb7-5bee228b170a Jan 13 21:48:42.824794 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:48:42.824815 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 21:48:42.827754 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 21:48:42.829465 kernel: BTRFS info (device dm-0): using free space tree Jan 13 21:48:42.841205 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. 
Jan 13 21:48:42.842732 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 13 21:48:42.850592 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 13 21:48:42.853325 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 21:48:42.871551 kernel: BTRFS info (device vda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 21:48:42.871591 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:48:42.871611 kernel: BTRFS info (device vda6): using free space tree Jan 13 21:48:42.876358 kernel: BTRFS info (device vda6): auto enabling async discard Jan 13 21:48:42.889445 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 21:48:42.892340 kernel: BTRFS info (device vda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 21:48:42.899810 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 21:48:42.906575 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 21:48:43.019431 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 21:48:43.028579 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 21:48:43.061939 ignition[681]: Ignition 2.20.0 Jan 13 21:48:43.061984 ignition[681]: Stage: fetch-offline Jan 13 21:48:43.064390 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 21:48:43.062070 ignition[681]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:43.062090 ignition[681]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:43.062279 ignition[681]: parsed url from cmdline: "" Jan 13 21:48:43.062286 ignition[681]: no config URL provided Jan 13 21:48:43.062295 ignition[681]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 21:48:43.062310 ignition[681]: no config at "/usr/lib/ignition/user.ign" Jan 13 21:48:43.062344 ignition[681]: failed to fetch config: resource requires networking Jan 13 21:48:43.062873 ignition[681]: Ignition finished successfully Jan 13 21:48:43.076098 systemd-networkd[770]: lo: Link UP Jan 13 21:48:43.076114 systemd-networkd[770]: lo: Gained carrier Jan 13 21:48:43.079125 systemd-networkd[770]: Enumeration completed Jan 13 21:48:43.079271 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 21:48:43.080024 systemd-networkd[770]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 21:48:43.080030 systemd-networkd[770]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 13 21:48:43.081898 systemd-networkd[770]: eth0: Link UP Jan 13 21:48:43.081905 systemd-networkd[770]: eth0: Gained carrier Jan 13 21:48:43.081922 systemd-networkd[770]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 21:48:43.083583 systemd[1]: Reached target network.target - Network. Jan 13 21:48:43.092623 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jan 13 21:48:43.108477 systemd-networkd[770]: eth0: DHCPv4 address 10.230.9.94/30, gateway 10.230.9.93 acquired from 10.230.9.93 Jan 13 21:48:43.110057 ignition[777]: Ignition 2.20.0 Jan 13 21:48:43.110069 ignition[777]: Stage: fetch Jan 13 21:48:43.110349 ignition[777]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:43.110373 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:43.110522 ignition[777]: parsed url from cmdline: "" Jan 13 21:48:43.110529 ignition[777]: no config URL provided Jan 13 21:48:43.110557 ignition[777]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 21:48:43.110574 ignition[777]: no config at "/usr/lib/ignition/user.ign" Jan 13 21:48:43.110763 ignition[777]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 13 21:48:43.111139 ignition[777]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 13 21:48:43.111184 ignition[777]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 13 21:48:43.127057 ignition[777]: GET result: OK Jan 13 21:48:43.127865 ignition[777]: parsing config with SHA512: a2ede868c2525c23fbc17c228d93ea584b39bb1f81eb641e360091c95f6ee0d8d7736caf3b733fa1844b3989f935c42cec74c9a3dba26a669f0fe846eb88a788 Jan 13 21:48:43.133274 unknown[777]: fetched base config from "system" Jan 13 21:48:43.133295 unknown[777]: fetched base config from "system" Jan 13 21:48:43.134398 ignition[777]: fetch: fetch complete Jan 13 21:48:43.133305 unknown[777]: fetched user config from "openstack" Jan 13 21:48:43.134408 ignition[777]: fetch: fetch passed Jan 13 21:48:43.136413 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 13 21:48:43.134478 ignition[777]: Ignition finished successfully Jan 13 21:48:43.146646 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 21:48:43.165185 ignition[784]: Ignition 2.20.0 Jan 13 21:48:43.165204 ignition[784]: Stage: kargs Jan 13 21:48:43.165454 ignition[784]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:43.165473 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:43.168847 ignition[784]: kargs: kargs passed Jan 13 21:48:43.171670 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 21:48:43.168957 ignition[784]: Ignition finished successfully Jan 13 21:48:43.181591 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 21:48:43.197280 ignition[790]: Ignition 2.20.0 Jan 13 21:48:43.197315 ignition[790]: Stage: disks Jan 13 21:48:43.197605 ignition[790]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:43.199707 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 21:48:43.197625 ignition[790]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:43.201124 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 21:48:43.198480 ignition[790]: disks: disks passed Jan 13 21:48:43.202756 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 21:48:43.198557 ignition[790]: Ignition finished successfully Jan 13 21:48:43.204470 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 21:48:43.206004 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 21:48:43.207264 systemd[1]: Reached target basic.target - Basic System. Jan 13 21:48:43.220576 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Jan 13 21:48:43.240613 systemd-fsck[798]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 21:48:43.247538 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 21:48:43.254435 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 21:48:43.370352 kernel: EXT4-fs (vda9): mounted filesystem 59ba8ffc-e6b0-4bb4-a36e-13a47bd6ad99 r/w with ordered data mode. Quota mode: none. Jan 13 21:48:43.370965 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 21:48:43.372371 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 21:48:43.379449 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 21:48:43.383606 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 21:48:43.385537 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 21:48:43.396558 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 13 21:48:43.409555 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (806) Jan 13 21:48:43.409593 kernel: BTRFS info (device vda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 21:48:43.409614 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:48:43.409632 kernel: BTRFS info (device vda6): using free space tree Jan 13 21:48:43.409651 kernel: BTRFS info (device vda6): auto enabling async discard Jan 13 21:48:43.407752 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 21:48:43.407807 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 21:48:43.414748 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 21:48:43.416417 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 21:48:43.425552 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 13 21:48:43.503083 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 21:48:43.509298 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory Jan 13 21:48:43.517660 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 21:48:43.524692 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 21:48:43.634313 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 21:48:43.640472 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 21:48:43.649577 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 21:48:43.661397 kernel: BTRFS info (device vda6): last unmount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 21:48:43.688793 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 13 21:48:43.698557 ignition[923]: INFO : Ignition 2.20.0 Jan 13 21:48:43.698557 ignition[923]: INFO : Stage: mount Jan 13 21:48:43.700375 ignition[923]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:43.700375 ignition[923]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:43.700375 ignition[923]: INFO : mount: mount passed Jan 13 21:48:43.700375 ignition[923]: INFO : Ignition finished successfully Jan 13 21:48:43.701793 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Jan 13 21:48:43.821351 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 21:48:44.400650 systemd-networkd[770]: eth0: Gained IPv6LL Jan 13 21:48:45.907534 systemd-networkd[770]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8257:24:19ff:fee6:95e/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8257:24:19ff:fee6:95e/64 assigned by NDisc. Jan 13 21:48:45.907550 systemd-networkd[770]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 13 21:48:50.576257 coreos-metadata[808]: Jan 13 21:48:50.576 WARN failed to locate config-drive, using the metadata service API instead Jan 13 21:48:50.600793 coreos-metadata[808]: Jan 13 21:48:50.600 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 13 21:48:50.612861 coreos-metadata[808]: Jan 13 21:48:50.612 INFO Fetch successful Jan 13 21:48:50.613712 coreos-metadata[808]: Jan 13 21:48:50.613 INFO wrote hostname srv-z907b.gb1.brightbox.com to /sysroot/etc/hostname Jan 13 21:48:50.615523 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 13 21:48:50.615746 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 13 21:48:50.636478 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 13 21:48:50.646278 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 21:48:50.664466 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (939) Jan 13 21:48:50.669889 kernel: BTRFS info (device vda6): first mount of filesystem de2056f8-fbde-4b85-b887-0a28f289d968 Jan 13 21:48:50.669939 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:48:50.672894 kernel: BTRFS info (device vda6): using free space tree Jan 13 21:48:50.694414 kernel: BTRFS info (device vda6): auto enabling async discard Jan 13 21:48:50.696013 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 13 21:48:50.731182 ignition[957]: INFO : Ignition 2.20.0 Jan 13 21:48:50.731182 ignition[957]: INFO : Stage: files Jan 13 21:48:50.732912 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:50.732912 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:50.732912 ignition[957]: DEBUG : files: compiled without relabeling support, skipping Jan 13 21:48:50.735911 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 13 21:48:50.735911 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 13 21:48:50.738420 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 13 21:48:50.738420 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 13 21:48:50.740437 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 13 21:48:50.739563 unknown[957]: wrote ssh authorized keys file for user: core Jan 13 21:48:50.742539 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh" Jan 13 21:48:50.742539 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh" Jan 13 21:48:50.742539 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 21:48:50.746366 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 21:48:50.746366 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 21:48:50.746366 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 21:48:50.746366 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 21:48:50.746366 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 Jan 13 21:48:51.310993 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Jan 13 21:48:53.923409 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 13 21:48:53.928391 ignition[957]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 13 21:48:53.928391 ignition[957]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 13 21:48:53.928391 ignition[957]: INFO : files: files passed Jan 13 21:48:53.928391 ignition[957]: INFO : Ignition finished successfully Jan 13 21:48:53.928879 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 13 21:48:53.946771 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... 
Jan 13 21:48:53.950557 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 13 21:48:53.956070 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 13 21:48:53.958452 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 13 21:48:53.974969 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 21:48:53.977039 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 13 21:48:53.979197 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 21:48:53.982179 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 21:48:53.983776 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 13 21:48:53.991557 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 13 21:48:54.039557 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 13 21:48:54.039771 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 13 21:48:54.041619 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 13 21:48:54.043011 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 13 21:48:54.044605 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 13 21:48:54.053586 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 13 21:48:54.072507 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 21:48:54.080607 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 13 21:48:54.096875 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 13 21:48:54.097865 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 21:48:54.099614 systemd[1]: Stopped target timers.target - Timer Units. Jan 13 21:48:54.101148 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 13 21:48:54.101368 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 21:48:54.103279 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 13 21:48:54.104302 systemd[1]: Stopped target basic.target - Basic System. Jan 13 21:48:54.105961 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 13 21:48:54.107407 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 21:48:54.108879 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 13 21:48:54.110561 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 13 21:48:54.112127 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 21:48:54.113842 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 13 21:48:54.115344 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 13 21:48:54.116935 systemd[1]: Stopped target swap.target - Swaps. Jan 13 21:48:54.118399 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 13 21:48:54.118572 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 13 21:48:54.120407 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. 
Jan 13 21:48:54.121468 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 21:48:54.122919 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 13 21:48:54.123373 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 21:48:54.124578 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 13 21:48:54.124782 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 13 21:48:54.126840 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 13 21:48:54.127042 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 21:48:54.128769 systemd[1]: ignition-files.service: Deactivated successfully. Jan 13 21:48:54.128934 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 13 21:48:54.143778 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 13 21:48:54.148587 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 13 21:48:54.150139 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 13 21:48:54.150359 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 21:48:54.154973 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 13 21:48:54.155925 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 13 21:48:54.169392 ignition[1010]: INFO : Ignition 2.20.0 Jan 13 21:48:54.169392 ignition[1010]: INFO : Stage: umount Jan 13 21:48:54.169392 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 21:48:54.169392 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 13 21:48:54.176254 ignition[1010]: INFO : umount: umount passed Jan 13 21:48:54.176254 ignition[1010]: INFO : Ignition finished successfully Jan 13 21:48:54.171424 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 13 21:48:54.171586 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 13 21:48:54.175905 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 13 21:48:54.176065 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 13 21:48:54.177911 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 13 21:48:54.178052 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 13 21:48:54.180320 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 13 21:48:54.180422 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 13 21:48:54.181178 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 13 21:48:54.181254 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 13 21:48:54.182036 systemd[1]: Stopped target network.target - Network. Jan 13 21:48:54.184445 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 13 21:48:54.184538 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 21:48:54.186184 systemd[1]: Stopped target paths.target - Path Units. Jan 13 21:48:54.188378 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 13 21:48:54.192687 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 21:48:54.194733 systemd[1]: Stopped target slices.target - Slice Units. Jan 13 21:48:54.196283 systemd[1]: Stopped target sockets.target - Socket Units. 
Jan 13 21:48:54.198223 systemd[1]: iscsid.socket: Deactivated successfully. Jan 13 21:48:54.198298 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 21:48:54.199050 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 13 21:48:54.199121 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 21:48:54.199846 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 13 21:48:54.199923 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 13 21:48:54.200601 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 13 21:48:54.200689 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 13 21:48:54.202766 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 13 21:48:54.204887 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 13 21:48:54.211559 systemd-networkd[770]: eth0: DHCPv6 lease lost Jan 13 21:48:54.214064 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 13 21:48:54.214929 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 13 21:48:54.215090 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 13 21:48:54.218105 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 13 21:48:54.218265 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 13 21:48:54.222033 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 13 21:48:54.222665 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 13 21:48:54.223541 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 13 21:48:54.223627 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 13 21:48:54.230505 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 13 21:48:54.231693 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 13 21:48:54.231769 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 13 21:48:54.234186 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 21:48:54.241800 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 13 21:48:54.241974 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 13 21:48:54.246279 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 13 21:48:54.246603 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 21:48:54.255808 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 13 21:48:54.255909 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 13 21:48:54.256820 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 13 21:48:54.256882 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 21:48:54.258391 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 13 21:48:54.258469 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 13 21:48:54.260739 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 13 21:48:54.260808 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 13 21:48:54.261749 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 21:48:54.261818 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jan 13 21:48:54.265547 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 13 21:48:54.266843 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 13 21:48:54.266923 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 13 21:48:54.269388 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 13 21:48:54.269469 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 13 21:48:54.270654 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 13 21:48:54.270724 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 21:48:54.273545 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 13 21:48:54.273619 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 21:48:54.275998 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 13 21:48:54.276079 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 21:48:54.277686 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 13 21:48:54.277753 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 21:48:54.279262 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 21:48:54.279364 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:48:54.283911 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 13 21:48:54.284068 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 13 21:48:54.289801 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 13 21:48:54.289941 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 13 21:48:54.291073 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 13 21:48:54.299660 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 13 21:48:54.311383 systemd[1]: Switching root. Jan 13 21:48:54.348419 systemd-journald[202]: Received SIGTERM from PID 1 (systemd). Jan 13 21:48:54.348533 systemd-journald[202]: Journal stopped Jan 13 21:48:55.887639 kernel: SELinux: policy capability network_peer_controls=1 Jan 13 21:48:55.887768 kernel: SELinux: policy capability open_perms=1 Jan 13 21:48:55.887811 kernel: SELinux: policy capability extended_socket_class=1 Jan 13 21:48:55.887842 kernel: SELinux: policy capability always_check_network=0 Jan 13 21:48:55.887892 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 13 21:48:55.887919 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 13 21:48:55.887947 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 13 21:48:55.887973 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 13 21:48:55.888000 kernel: audit: type=1403 audit(1736804934.677:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 13 21:48:55.888025 systemd[1]: Successfully loaded SELinux policy in 51.305ms. Jan 13 21:48:55.888101 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 21.672ms. 
Jan 13 21:48:55.888131 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 21:48:55.888161 systemd[1]: Detected virtualization kvm. Jan 13 21:48:55.888195 systemd[1]: Detected architecture x86-64. Jan 13 21:48:55.888235 systemd[1]: Detected first boot. Jan 13 21:48:55.888272 systemd[1]: Hostname set to . Jan 13 21:48:55.888296 systemd[1]: Initializing machine ID from VM UUID. Jan 13 21:48:55.888318 zram_generator::config[1052]: No configuration found. Jan 13 21:48:55.888384 systemd[1]: Populated /etc with preset unit settings. Jan 13 21:48:55.888417 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 13 21:48:55.888457 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 13 21:48:55.888481 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 13 21:48:55.888512 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 13 21:48:55.888552 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 13 21:48:55.888583 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 13 21:48:55.888617 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 13 21:48:55.888648 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 13 21:48:55.888672 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 13 21:48:55.888695 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 13 21:48:55.888717 systemd[1]: Created slice user.slice - User and Session Slice. Jan 13 21:48:55.888750 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 21:48:55.888773 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 13 21:48:55.888812 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 13 21:48:55.888837 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 13 21:48:55.888872 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 13 21:48:55.888893 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 21:48:55.888914 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 13 21:48:55.888953 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 21:48:55.888975 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 13 21:48:55.888997 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 13 21:48:55.889031 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 13 21:48:55.889055 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 13 21:48:55.889076 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 21:48:55.889117 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 21:48:55.889140 systemd[1]: Reached target slices.target - Slice Units. 
Jan 13 21:48:55.889162 systemd[1]: Reached target swap.target - Swaps. Jan 13 21:48:55.889198 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 13 21:48:55.889262 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 13 21:48:55.889305 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 21:48:55.889348 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 21:48:55.889374 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 13 21:48:55.889409 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 13 21:48:55.889434 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 13 21:48:55.889457 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 13 21:48:55.889483 systemd[1]: Mounting media.mount - External Media Directory... Jan 13 21:48:55.889505 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 21:48:55.889542 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 13 21:48:55.889566 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 13 21:48:55.889605 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 13 21:48:55.889638 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 13 21:48:55.889662 systemd[1]: Reached target machines.target - Containers. Jan 13 21:48:55.889684 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 13 21:48:55.889706 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 21:48:55.889728 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 21:48:55.889750 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 13 21:48:55.889787 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 21:48:55.889818 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 21:48:55.889863 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 21:48:55.889902 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 13 21:48:55.889926 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 13 21:48:55.889965 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 13 21:48:55.889996 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 13 21:48:55.890024 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 13 21:48:55.890048 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 13 21:48:55.890070 systemd[1]: Stopped systemd-fsck-usr.service. Jan 13 21:48:55.890092 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 21:48:55.890114 kernel: loop: module loaded Jan 13 21:48:55.890136 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 21:48:55.890165 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Jan 13 21:48:55.890206 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 13 21:48:55.890236 kernel: fuse: init (API version 7.39) Jan 13 21:48:55.890259 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 21:48:55.890288 kernel: ACPI: bus type drm_connector registered Jan 13 21:48:55.890312 systemd[1]: verity-setup.service: Deactivated successfully. Jan 13 21:48:55.890361 systemd[1]: Stopped verity-setup.service. Jan 13 21:48:55.890388 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 21:48:55.890410 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 13 21:48:55.890433 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 13 21:48:55.890470 systemd[1]: Mounted media.mount - External Media Directory. Jan 13 21:48:55.890495 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 13 21:48:55.890524 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 13 21:48:55.890579 systemd-journald[1148]: Collecting audit messages is disabled. Jan 13 21:48:55.890668 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 13 21:48:55.890699 systemd-journald[1148]: Journal started Jan 13 21:48:55.890731 systemd-journald[1148]: Runtime Journal (/run/log/journal/dac6e42b76cb4c8fbb4239e91d6757a3) is 4.7M, max 37.9M, 33.2M free. Jan 13 21:48:55.472732 systemd[1]: Queued start job for default target multi-user.target. Jan 13 21:48:55.497582 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 13 21:48:55.498287 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 13 21:48:55.893465 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 21:48:55.894996 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 13 21:48:55.896150 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 21:48:55.897421 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 13 21:48:55.897689 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 13 21:48:55.899130 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 21:48:55.899407 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 21:48:55.900668 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 21:48:55.900892 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 21:48:55.902011 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 21:48:55.902257 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 21:48:55.903629 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 13 21:48:55.903844 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 13 21:48:55.905089 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 21:48:55.905288 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 21:48:55.906654 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 21:48:55.907765 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 13 21:48:55.908995 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. 
Jan 13 21:48:55.922938 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 13 21:48:55.934427 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 13 21:48:55.938040 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 13 21:48:55.944870 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 13 21:48:55.944926 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 21:48:55.947022 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 13 21:48:55.955558 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 13 21:48:55.964464 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 13 21:48:55.965443 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 21:48:55.973635 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 13 21:48:55.978064 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 13 21:48:55.979443 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 21:48:55.984707 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 13 21:48:55.985856 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 21:48:55.997568 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 21:48:56.002549 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 13 21:48:56.006436 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 21:48:56.012137 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 13 21:48:56.014097 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 13 21:48:56.016852 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 13 21:48:56.042507 systemd-journald[1148]: Time spent on flushing to /var/log/journal/dac6e42b76cb4c8fbb4239e91d6757a3 is 24.273ms for 1123 entries. Jan 13 21:48:56.042507 systemd-journald[1148]: System Journal (/var/log/journal/dac6e42b76cb4c8fbb4239e91d6757a3) is 8.0M, max 584.8M, 576.8M free. Jan 13 21:48:56.084451 systemd-journald[1148]: Received client request to flush runtime journal. Jan 13 21:48:56.071526 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 13 21:48:56.074840 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 13 21:48:56.084623 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 13 21:48:56.101634 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 13 21:48:56.122736 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 13 21:48:56.131364 kernel: loop0: detected capacity change from 0 to 8 Jan 13 21:48:56.125628 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. 
Jan 13 21:48:56.150453 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 13 21:48:56.170160 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 21:48:56.176397 kernel: loop1: detected capacity change from 0 to 205544 Jan 13 21:48:56.179027 systemd-tmpfiles[1186]: ACLs are not supported, ignoring. Jan 13 21:48:56.181380 systemd-tmpfiles[1186]: ACLs are not supported, ignoring. Jan 13 21:48:56.196963 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 13 21:48:56.205526 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 13 21:48:56.246352 kernel: loop2: detected capacity change from 0 to 141000 Jan 13 21:48:56.271900 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 21:48:56.281531 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 13 21:48:56.326345 kernel: loop3: detected capacity change from 0 to 138184 Jan 13 21:48:56.328422 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 13 21:48:56.347536 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 21:48:56.358231 udevadm[1207]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Jan 13 21:48:56.384359 kernel: loop4: detected capacity change from 0 to 8 Jan 13 21:48:56.400644 kernel: loop5: detected capacity change from 0 to 205544 Jan 13 21:48:56.415527 systemd-tmpfiles[1210]: ACLs are not supported, ignoring. Jan 13 21:48:56.415556 systemd-tmpfiles[1210]: ACLs are not supported, ignoring. Jan 13 21:48:56.435522 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 21:48:56.450922 kernel: loop6: detected capacity change from 0 to 141000 Jan 13 21:48:56.502604 kernel: loop7: detected capacity change from 0 to 138184 Jan 13 21:48:56.532668 (sd-merge)[1213]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Jan 13 21:48:56.534033 (sd-merge)[1213]: Merged extensions into '/usr'. Jan 13 21:48:56.547813 systemd[1]: Reloading requested from client PID 1185 ('systemd-sysext') (unit systemd-sysext.service)... Jan 13 21:48:56.547846 systemd[1]: Reloading... Jan 13 21:48:56.728402 zram_generator::config[1240]: No configuration found. Jan 13 21:48:56.776735 ldconfig[1180]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 13 21:48:56.968362 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 21:48:57.041583 systemd[1]: Reloading finished in 492 ms. Jan 13 21:48:57.082219 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 13 21:48:57.084072 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 13 21:48:57.098643 systemd[1]: Starting ensure-sysext.service... Jan 13 21:48:57.103720 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 21:48:57.115654 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 13 21:48:57.126510 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jan 13 21:48:57.127971 systemd[1]: Reloading requested from client PID 1296 ('systemctl') (unit ensure-sysext.service)... Jan 13 21:48:57.127986 systemd[1]: Reloading... Jan 13 21:48:57.142914 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 13 21:48:57.143470 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 13 21:48:57.144983 systemd-tmpfiles[1297]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 13 21:48:57.145478 systemd-tmpfiles[1297]: ACLs are not supported, ignoring. Jan 13 21:48:57.145617 systemd-tmpfiles[1297]: ACLs are not supported, ignoring. Jan 13 21:48:57.151616 systemd-tmpfiles[1297]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 21:48:57.151634 systemd-tmpfiles[1297]: Skipping /boot Jan 13 21:48:57.170790 systemd-tmpfiles[1297]: Detected autofs mount point /boot during canonicalization of boot. Jan 13 21:48:57.170810 systemd-tmpfiles[1297]: Skipping /boot Jan 13 21:48:57.212139 systemd-udevd[1299]: Using default interface naming scheme 'v255'. Jan 13 21:48:57.245407 zram_generator::config[1325]: No configuration found. Jan 13 21:48:57.434366 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 44 scanned by (udev-worker) (1332) Jan 13 21:48:57.505170 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 21:48:57.621372 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jan 13 21:48:57.624284 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 13 21:48:57.625214 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 13 21:48:57.625795 systemd[1]: Reloading finished in 497 ms. Jan 13 21:48:57.651266 kernel: ACPI: button: Power Button [PWRF] Jan 13 21:48:57.650496 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 21:48:57.658366 kernel: mousedev: PS/2 mouse device common for all mice Jan 13 21:48:57.660500 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 21:48:57.695343 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 13 21:48:57.702927 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Jan 13 21:48:57.703256 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 13 21:48:57.706284 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 21:48:57.713382 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 13 21:48:57.716885 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 21:48:57.721752 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 13 21:48:57.723644 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 21:48:57.734627 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 13 21:48:57.745483 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 13 21:48:57.761661 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Jan 13 21:48:57.763843 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 21:48:57.770677 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 13 21:48:57.780234 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 13 21:48:57.785056 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 21:48:57.793955 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 13 21:48:57.801631 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 13 21:48:57.802776 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 21:48:57.808918 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 21:48:57.809262 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 13 21:48:57.817681 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 13 21:48:57.820633 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 13 21:48:57.820845 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 13 21:48:57.829164 systemd[1]: Finished ensure-sysext.service. Jan 13 21:48:57.830405 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 13 21:48:57.830649 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 13 21:48:57.837131 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 13 21:48:57.838463 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 13 21:48:57.838722 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 13 21:48:57.839971 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 13 21:48:57.840169 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 13 21:48:57.842053 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 13 21:48:57.842149 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 13 21:48:57.853511 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 13 21:48:57.853766 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 13 21:48:57.870658 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 13 21:48:57.872971 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 13 21:48:57.886404 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 13 21:48:57.911667 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 13 21:48:57.930373 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 13 21:48:57.939275 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 13 21:48:57.965511 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 13 21:48:57.967879 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 13 21:48:57.978243 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 13 21:48:58.025820 augenrules[1452]: No rules Jan 13 21:48:58.027793 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 21:48:58.028187 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 21:48:58.029165 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 13 21:48:58.225295 systemd-networkd[1419]: lo: Link UP Jan 13 21:48:58.225308 systemd-networkd[1419]: lo: Gained carrier Jan 13 21:48:58.227627 systemd-networkd[1419]: Enumeration completed Jan 13 21:48:58.227762 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 21:48:58.228363 systemd-networkd[1419]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 21:48:58.228369 systemd-networkd[1419]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 13 21:48:58.230086 systemd-networkd[1419]: eth0: Link UP Jan 13 21:48:58.230100 systemd-networkd[1419]: eth0: Gained carrier Jan 13 21:48:58.230119 systemd-networkd[1419]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 13 21:48:58.253406 systemd-networkd[1419]: eth0: DHCPv4 address 10.230.9.94/30, gateway 10.230.9.93 acquired from 10.230.9.93 Jan 13 21:48:58.254676 systemd-timesyncd[1428]: Network configuration changed, trying to establish connection. Jan 13 21:48:58.279085 systemd-resolved[1420]: Positive Trust Anchors: Jan 13 21:48:58.279637 systemd-resolved[1420]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 21:48:58.279763 systemd-resolved[1420]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 21:48:58.286160 systemd-resolved[1420]: Using system hostname 'srv-z907b.gb1.brightbox.com'. Jan 13 21:48:58.302306 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 13 21:48:58.303398 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 21:48:58.304688 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 13 21:48:58.305972 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:48:58.308752 systemd[1]: Reached target network.target - Network. Jan 13 21:48:58.309446 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 21:48:58.310221 systemd[1]: Reached target time-set.target - System Time Set. Jan 13 21:48:58.318634 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... 
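eth0 above is matched by the stock /usr/lib/systemd/network/zz-default.network and then picks up 10.230.9.94/30 over DHCP. Flatcar's shipped default behaves essentially like the catch-all sketch below (illustrative, not the literal file contents):

    [Match]
    Name=*

    [Network]
    DHCP=yes

The "potentially unpredictable interface name" note is systemd-networkd pointing out that the match is by interface name; a site-specific file under /etc/systemd/network/ matching on MACAddress= instead would avoid that warning.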
Jan 13 21:48:58.321125 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 13 21:48:58.341631 lvm[1472]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 21:48:58.382142 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 13 21:48:58.384083 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 21:48:58.385130 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 21:48:58.386171 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 13 21:48:58.387191 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 13 21:48:58.388460 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 13 21:48:58.389421 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 13 21:48:58.390254 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 13 21:48:58.391053 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 13 21:48:58.391117 systemd[1]: Reached target paths.target - Path Units. Jan 13 21:48:58.391783 systemd[1]: Reached target timers.target - Timer Units. Jan 13 21:48:58.393953 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 13 21:48:58.396562 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 13 21:48:58.403586 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 13 21:48:58.406212 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 13 21:48:58.407832 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 13 21:48:58.408704 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 21:48:58.409442 systemd[1]: Reached target basic.target - Basic System. Jan 13 21:48:58.410158 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 13 21:48:58.410203 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 13 21:48:58.413493 systemd[1]: Starting containerd.service - containerd container runtime... Jan 13 21:48:58.422519 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 13 21:48:58.423923 lvm[1477]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 13 21:48:58.428566 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 13 21:48:58.433470 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 13 21:48:58.438598 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 13 21:48:58.440431 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 13 21:48:58.449561 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 13 21:48:58.460572 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 13 21:48:58.468708 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 13 21:48:58.481581 systemd[1]: Starting systemd-logind.service - User Login Management... 
Jan 13 21:48:58.483212 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 13 21:48:58.484929 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 13 21:48:58.487571 systemd[1]: Starting update-engine.service - Update Engine... Jan 13 21:48:58.490974 jq[1481]: false Jan 13 21:48:58.498465 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 13 21:48:58.502760 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 13 21:48:58.506491 dbus-daemon[1480]: [system] SELinux support is enabled Jan 13 21:48:58.509955 dbus-daemon[1480]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1419 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Jan 13 21:48:58.512636 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 13 21:48:58.518907 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 13 21:48:58.519190 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 13 21:48:58.530824 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 13 21:48:58.531099 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 13 21:48:58.535499 extend-filesystems[1483]: Found loop4 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found loop5 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found loop6 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found loop7 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda1 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda2 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda3 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found usr Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda4 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda6 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda7 Jan 13 21:48:58.540266 extend-filesystems[1483]: Found vda9 Jan 13 21:48:58.540266 extend-filesystems[1483]: Checking size of /dev/vda9 Jan 13 21:48:58.537773 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.systemd1' Jan 13 21:48:58.535625 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 13 21:48:58.554241 jq[1492]: true Jan 13 21:48:58.535675 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 13 21:48:58.538763 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 13 21:48:58.538794 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 13 21:48:58.570625 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Jan 13 21:48:58.574145 systemd[1]: motdgen.service: Deactivated successfully. Jan 13 21:48:58.574457 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jan 13 21:48:58.602679 extend-filesystems[1483]: Resized partition /dev/vda9 Jan 13 21:48:58.611028 (ntainerd)[1511]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 13 21:48:58.614831 extend-filesystems[1516]: resize2fs 1.47.1 (20-May-2024) Jan 13 21:48:58.618478 jq[1506]: true Jan 13 21:48:58.634403 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Jan 13 21:48:58.668369 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 44 scanned by (udev-worker) (1333) Jan 13 21:48:58.686150 update_engine[1491]: I20250113 21:48:58.685516 1491 main.cc:92] Flatcar Update Engine starting Jan 13 21:48:58.750452 update_engine[1491]: I20250113 21:48:58.710549 1491 update_check_scheduler.cc:74] Next update check in 4m33s Jan 13 21:48:58.709191 systemd[1]: Started update-engine.service - Update Engine. Jan 13 21:48:58.717626 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 13 21:48:58.826784 systemd-timesyncd[1428]: Contacted time server 46.101.52.249:123 (0.flatcar.pool.ntp.org). Jan 13 21:48:58.829665 systemd-timesyncd[1428]: Initial clock synchronization to Mon 2025-01-13 21:48:59.104547 UTC. Jan 13 21:48:58.836168 bash[1532]: Updated "/home/core/.ssh/authorized_keys" Jan 13 21:48:58.838525 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 13 21:48:58.841057 systemd-logind[1488]: Watching system buttons on /dev/input/event2 (Power Button) Jan 13 21:48:58.841107 systemd-logind[1488]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 13 21:48:58.849551 systemd[1]: Starting sshkeys.service... Jan 13 21:48:58.853114 systemd-logind[1488]: New seat seat0. Jan 13 21:48:58.859821 systemd[1]: Started systemd-logind.service - User Login Management. Jan 13 21:48:58.933009 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 13 21:48:58.944779 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 13 21:48:58.980274 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.hostname1' Jan 13 21:48:58.980720 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Jan 13 21:48:58.982154 dbus-daemon[1480]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1510 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Jan 13 21:48:58.986450 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Jan 13 21:48:58.995747 systemd[1]: Starting polkit.service - Authorization Manager... Jan 13 21:48:59.016319 extend-filesystems[1516]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 13 21:48:59.016319 extend-filesystems[1516]: old_desc_blocks = 1, new_desc_blocks = 8 Jan 13 21:48:59.016319 extend-filesystems[1516]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Jan 13 21:48:59.015227 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 13 21:48:59.028094 extend-filesystems[1483]: Resized filesystem in /dev/vda9 Jan 13 21:48:59.015502 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
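extend-filesystems above grows the root filesystem on first boot: resize2fs takes /dev/vda9 from 1617920 to 15121403 4k blocks while it is mounted, which is an ordinary online ext4 grow. Done by hand it would amount to roughly the following (device name taken from the log):

    lsblk /dev/vda          # confirm the partition already spans the enlarged disk
    resize2fs /dev/vda9     # grow the mounted ext4 filesystem to fill its partition

With no explicit size argument resize2fs expands the filesystem to the size of the underlying device, so the "on-line resizing required" and "now 15121403 (4k) blocks long" lines are the expected outcome.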
Jan 13 21:48:59.035657 polkitd[1547]: Started polkitd version 121 Jan 13 21:48:59.059751 polkitd[1547]: Loading rules from directory /etc/polkit-1/rules.d Jan 13 21:48:59.059853 polkitd[1547]: Loading rules from directory /usr/share/polkit-1/rules.d Jan 13 21:48:59.061326 locksmithd[1534]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 13 21:48:59.065840 polkitd[1547]: Finished loading, compiling and executing 2 rules Jan 13 21:48:59.069105 dbus-daemon[1480]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Jan 13 21:48:59.069396 systemd[1]: Started polkit.service - Authorization Manager. Jan 13 21:48:59.069881 polkitd[1547]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Jan 13 21:48:59.104261 systemd-hostnamed[1510]: Hostname set to (static) Jan 13 21:48:59.145738 containerd[1511]: time="2025-01-13T21:48:59.145605187Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Jan 13 21:48:59.194905 containerd[1511]: time="2025-01-13T21:48:59.194811440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.199601418Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.199646703Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.199673564Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.199979303Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200005696Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200142627Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200164117Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200465100Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200520554Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200543540Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201387 containerd[1511]: time="2025-01-13T21:48:59.200560660Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201770 containerd[1511]: time="2025-01-13T21:48:59.200689757Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201770 containerd[1511]: time="2025-01-13T21:48:59.201116360Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201770 containerd[1511]: time="2025-01-13T21:48:59.201248875Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 13 21:48:59.201770 containerd[1511]: time="2025-01-13T21:48:59.201299481Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 13 21:48:59.202125 containerd[1511]: time="2025-01-13T21:48:59.202098460Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 13 21:48:59.202325 containerd[1511]: time="2025-01-13T21:48:59.202274348Z" level=info msg="metadata content store policy set" policy=shared Jan 13 21:48:59.206479 containerd[1511]: time="2025-01-13T21:48:59.206437994Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 13 21:48:59.206656 containerd[1511]: time="2025-01-13T21:48:59.206628523Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 13 21:48:59.206802 containerd[1511]: time="2025-01-13T21:48:59.206776892Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 13 21:48:59.206916 containerd[1511]: time="2025-01-13T21:48:59.206892584Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 13 21:48:59.207014 containerd[1511]: time="2025-01-13T21:48:59.206991024Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 13 21:48:59.207301 containerd[1511]: time="2025-01-13T21:48:59.207265093Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 13 21:48:59.207724 containerd[1511]: time="2025-01-13T21:48:59.207690808Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 13 21:48:59.208066 containerd[1511]: time="2025-01-13T21:48:59.208038062Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 13 21:48:59.208205 containerd[1511]: time="2025-01-13T21:48:59.208180993Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 13 21:48:59.208325 containerd[1511]: time="2025-01-13T21:48:59.208294746Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 13 21:48:59.208436 containerd[1511]: time="2025-01-13T21:48:59.208413342Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." 
type=io.containerd.service.v1 Jan 13 21:48:59.208584 containerd[1511]: time="2025-01-13T21:48:59.208559381Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.208680 containerd[1511]: time="2025-01-13T21:48:59.208657168Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.208777 containerd[1511]: time="2025-01-13T21:48:59.208753003Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.208916 containerd[1511]: time="2025-01-13T21:48:59.208890868Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.209038 containerd[1511]: time="2025-01-13T21:48:59.209013817Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.209142 containerd[1511]: time="2025-01-13T21:48:59.209119740Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.209258 containerd[1511]: time="2025-01-13T21:48:59.209233975Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 13 21:48:59.209387 containerd[1511]: time="2025-01-13T21:48:59.209349159Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.209513 containerd[1511]: time="2025-01-13T21:48:59.209491232Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.209651 containerd[1511]: time="2025-01-13T21:48:59.209627238Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.209763 containerd[1511]: time="2025-01-13T21:48:59.209738965Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.209873 containerd[1511]: time="2025-01-13T21:48:59.209848265Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.209978 containerd[1511]: time="2025-01-13T21:48:59.209955184Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210071 containerd[1511]: time="2025-01-13T21:48:59.210049055Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210179 containerd[1511]: time="2025-01-13T21:48:59.210155585Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210251696Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210300039Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210321721Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210355018Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." 
type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210374102Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210426821Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210461339Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210484124Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.210599 containerd[1511]: time="2025-01-13T21:48:59.210512901Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 13 21:48:59.211013 containerd[1511]: time="2025-01-13T21:48:59.210980744Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211221367Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211248603Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211284641Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211300806Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211320056Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211359626Z" level=info msg="NRI interface is disabled by configuration." Jan 13 21:48:59.211441 containerd[1511]: time="2025-01-13T21:48:59.211405706Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Jan 13 21:48:59.212392 containerd[1511]: time="2025-01-13T21:48:59.212141735Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 13 21:48:59.212392 containerd[1511]: time="2025-01-13T21:48:59.212226903Z" level=info msg="Connect containerd service" Jan 13 21:48:59.212392 containerd[1511]: time="2025-01-13T21:48:59.212285484Z" level=info msg="using legacy CRI server" Jan 13 21:48:59.212392 containerd[1511]: time="2025-01-13T21:48:59.212301015Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 13 21:48:59.213382 containerd[1511]: time="2025-01-13T21:48:59.212931911Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 13 21:48:59.214190 containerd[1511]: time="2025-01-13T21:48:59.214158061Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 13 21:48:59.214471 
containerd[1511]: time="2025-01-13T21:48:59.214414317Z" level=info msg="Start subscribing containerd event" Jan 13 21:48:59.215250 containerd[1511]: time="2025-01-13T21:48:59.214566312Z" level=info msg="Start recovering state" Jan 13 21:48:59.215250 containerd[1511]: time="2025-01-13T21:48:59.214676170Z" level=info msg="Start event monitor" Jan 13 21:48:59.215250 containerd[1511]: time="2025-01-13T21:48:59.214706071Z" level=info msg="Start snapshots syncer" Jan 13 21:48:59.215250 containerd[1511]: time="2025-01-13T21:48:59.214733590Z" level=info msg="Start cni network conf syncer for default" Jan 13 21:48:59.215250 containerd[1511]: time="2025-01-13T21:48:59.214748759Z" level=info msg="Start streaming server" Jan 13 21:48:59.215743 containerd[1511]: time="2025-01-13T21:48:59.215719585Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 13 21:48:59.215936 containerd[1511]: time="2025-01-13T21:48:59.215910804Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 13 21:48:59.216132 containerd[1511]: time="2025-01-13T21:48:59.216110685Z" level=info msg="containerd successfully booted in 0.072040s" Jan 13 21:48:59.216237 systemd[1]: Started containerd.service - containerd container runtime. Jan 13 21:48:59.333517 sshd_keygen[1508]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 13 21:48:59.362236 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 13 21:48:59.369861 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 13 21:48:59.382566 systemd[1]: issuegen.service: Deactivated successfully. Jan 13 21:48:59.382841 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 13 21:48:59.389784 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 13 21:48:59.416348 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 13 21:48:59.426904 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 13 21:48:59.429921 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 13 21:48:59.431053 systemd[1]: Reached target getty.target - Login Prompts. Jan 13 21:48:59.501306 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 13 21:48:59.509807 systemd[1]: Started sshd@0-10.230.9.94:22-139.178.68.195:42278.service - OpenSSH per-connection server daemon (139.178.68.195:42278). Jan 13 21:48:59.698894 systemd-networkd[1419]: eth0: Gained IPv6LL Jan 13 21:48:59.702787 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 13 21:48:59.705725 systemd[1]: Reached target network-online.target - Network is Online. Jan 13 21:48:59.713755 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:48:59.718772 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 21:48:59.750825 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 13 21:49:00.462653 sshd[1580]: Accepted publickey for core from 139.178.68.195 port 42278 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:00.466037 sshd-session[1580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:00.483755 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 13 21:49:00.492866 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 13 21:49:00.506160 systemd-logind[1488]: New session 1 of user core. 
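The CRI configuration containerd dumps during startup above shows the overlayfs snapshotter, runc via io.containerd.runc.v2 with SystemdCgroup:true, and the pause:3.8 sandbox image. Written as a containerd config.toml excerpt those same settings look roughly like this (illustrative; the running daemon assembled its configuration internally rather than from this snippet):

    version = 2

    [plugins."io.containerd.grpc.v1.cri"]
      sandbox_image = "registry.k8s.io/pause:3.8"

    [plugins."io.containerd.grpc.v1.cri".containerd]
      snapshotter = "overlayfs"
      default_runtime_name = "runc"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
      SystemdCgroup = true

The CNI error logged during the same startup ("no network config found in /etc/cni/net.d") is expected at this stage: no CNI plugin has written a network config yet, and the CRI plugin's conf syncer picks one up once it appears.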
Jan 13 21:49:00.517892 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 13 21:49:00.530272 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 13 21:49:00.537734 (systemd)[1597]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 13 21:49:00.645557 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:49:00.656862 (kubelet)[1607]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:49:00.681115 systemd[1597]: Queued start job for default target default.target. Jan 13 21:49:00.689799 systemd[1597]: Created slice app.slice - User Application Slice. Jan 13 21:49:00.689842 systemd[1597]: Reached target paths.target - Paths. Jan 13 21:49:00.689867 systemd[1597]: Reached target timers.target - Timers. Jan 13 21:49:00.693531 systemd[1597]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 13 21:49:00.715593 systemd[1597]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 13 21:49:00.716664 systemd[1597]: Reached target sockets.target - Sockets. Jan 13 21:49:00.716701 systemd[1597]: Reached target basic.target - Basic System. Jan 13 21:49:00.716777 systemd[1597]: Reached target default.target - Main User Target. Jan 13 21:49:00.716858 systemd[1597]: Startup finished in 168ms. Jan 13 21:49:00.717429 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 13 21:49:00.726772 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 13 21:49:00.971500 systemd-networkd[1419]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8257:24:19ff:fee6:95e/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8257:24:19ff:fee6:95e/64 assigned by NDisc. Jan 13 21:49:00.971513 systemd-networkd[1419]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 13 21:49:01.270580 kubelet[1607]: E0113 21:49:01.270341 1607 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:49:01.273574 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:49:01.273885 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:49:01.382854 systemd[1]: Started sshd@1-10.230.9.94:22-139.178.68.195:42282.service - OpenSSH per-connection server daemon (139.178.68.195:42282). Jan 13 21:49:02.291610 sshd[1621]: Accepted publickey for core from 139.178.68.195 port 42282 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:02.293547 sshd-session[1621]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:02.301068 systemd-logind[1488]: New session 2 of user core. Jan 13 21:49:02.308678 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 13 21:49:02.921505 sshd[1625]: Connection closed by 139.178.68.195 port 42282 Jan 13 21:49:02.920788 sshd-session[1621]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:02.925554 systemd-logind[1488]: Session 2 logged out. Waiting for processes to exit. Jan 13 21:49:02.926047 systemd[1]: sshd@1-10.230.9.94:22-139.178.68.195:42282.service: Deactivated successfully. 
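kubelet exits immediately above because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-style node that file is only written when the node is initialized or joined, so this failure, and the restart attempts that follow, are expected until then. For orientation, the file that eventually lands there is a KubeletConfiguration, minimally something like the sketch below (values assumed, not read from this host):

    # /var/lib/kubelet/config.yaml  (illustrative sketch)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock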
Jan 13 21:49:02.928330 systemd[1]: session-2.scope: Deactivated successfully. Jan 13 21:49:02.930829 systemd-logind[1488]: Removed session 2. Jan 13 21:49:03.077574 systemd[1]: Started sshd@2-10.230.9.94:22-139.178.68.195:42286.service - OpenSSH per-connection server daemon (139.178.68.195:42286). Jan 13 21:49:03.994917 sshd[1630]: Accepted publickey for core from 139.178.68.195 port 42286 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:03.996938 sshd-session[1630]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:04.003691 systemd-logind[1488]: New session 3 of user core. Jan 13 21:49:04.015645 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 13 21:49:04.488164 agetty[1577]: failed to open credentials directory Jan 13 21:49:04.488539 agetty[1578]: failed to open credentials directory Jan 13 21:49:04.506118 login[1577]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 21:49:04.509966 login[1578]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 21:49:04.513435 systemd-logind[1488]: New session 5 of user core. Jan 13 21:49:04.522713 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 13 21:49:04.528423 systemd-logind[1488]: New session 4 of user core. Jan 13 21:49:04.532609 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 13 21:49:04.637028 sshd[1632]: Connection closed by 139.178.68.195 port 42286 Jan 13 21:49:04.638000 sshd-session[1630]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:04.641836 systemd[1]: sshd@2-10.230.9.94:22-139.178.68.195:42286.service: Deactivated successfully. Jan 13 21:49:04.644119 systemd[1]: session-3.scope: Deactivated successfully. Jan 13 21:49:04.646116 systemd-logind[1488]: Session 3 logged out. Waiting for processes to exit. Jan 13 21:49:04.647764 systemd-logind[1488]: Removed session 3. 
Jan 13 21:49:05.554526 coreos-metadata[1479]: Jan 13 21:49:05.554 WARN failed to locate config-drive, using the metadata service API instead Jan 13 21:49:05.581016 coreos-metadata[1479]: Jan 13 21:49:05.580 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 13 21:49:05.586829 coreos-metadata[1479]: Jan 13 21:49:05.586 INFO Fetch failed with 404: resource not found Jan 13 21:49:05.586829 coreos-metadata[1479]: Jan 13 21:49:05.586 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 13 21:49:05.587768 coreos-metadata[1479]: Jan 13 21:49:05.587 INFO Fetch successful Jan 13 21:49:05.587872 coreos-metadata[1479]: Jan 13 21:49:05.587 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 13 21:49:05.600075 coreos-metadata[1479]: Jan 13 21:49:05.600 INFO Fetch successful Jan 13 21:49:05.600075 coreos-metadata[1479]: Jan 13 21:49:05.600 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 13 21:49:05.614734 coreos-metadata[1479]: Jan 13 21:49:05.614 INFO Fetch successful Jan 13 21:49:05.614734 coreos-metadata[1479]: Jan 13 21:49:05.614 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 13 21:49:05.627664 coreos-metadata[1479]: Jan 13 21:49:05.627 INFO Fetch successful Jan 13 21:49:05.627664 coreos-metadata[1479]: Jan 13 21:49:05.627 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 13 21:49:05.644386 coreos-metadata[1479]: Jan 13 21:49:05.644 INFO Fetch successful Jan 13 21:49:05.679539 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 13 21:49:05.680713 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 13 21:49:06.050038 coreos-metadata[1544]: Jan 13 21:49:06.049 WARN failed to locate config-drive, using the metadata service API instead Jan 13 21:49:06.072500 coreos-metadata[1544]: Jan 13 21:49:06.072 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 13 21:49:06.101705 coreos-metadata[1544]: Jan 13 21:49:06.101 INFO Fetch successful Jan 13 21:49:06.101835 coreos-metadata[1544]: Jan 13 21:49:06.101 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 13 21:49:06.135607 coreos-metadata[1544]: Jan 13 21:49:06.135 INFO Fetch successful Jan 13 21:49:06.139022 unknown[1544]: wrote ssh authorized keys file for user: core Jan 13 21:49:06.157223 update-ssh-keys[1671]: Updated "/home/core/.ssh/authorized_keys" Jan 13 21:49:06.157877 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 13 21:49:06.161123 systemd[1]: Finished sshkeys.service. Jan 13 21:49:06.165192 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 21:49:06.165684 systemd[1]: Startup finished in 1.479s (kernel) + 14.914s (initrd) + 11.538s (userspace) = 27.932s. Jan 13 21:49:08.166794 systemd[1]: Started sshd@3-10.230.9.94:22-106.75.165.53:58430.service - OpenSSH per-connection server daemon (106.75.165.53:58430). Jan 13 21:49:11.466380 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 21:49:11.474590 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:49:11.619766 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
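coreos-metadata above finds no OpenStack config-drive and falls back to the link-local metadata service: the openstack meta_data.json path returns 404, so it switches to the EC2-compatible latest/meta-data/* endpoints for hostname, instance-id, instance-type and addresses. The same endpoints can be probed by hand from inside the instance, e.g.:

    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key

The second URL is what the sshkeys variant of the agent fetches above to populate /home/core/.ssh/authorized_keys before multi-user.target is reached.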
Jan 13 21:49:11.630824 (kubelet)[1685]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:49:11.678181 kubelet[1685]: E0113 21:49:11.678102 1685 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:49:11.681792 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:49:11.682038 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:49:14.897929 systemd[1]: Started sshd@4-10.230.9.94:22-139.178.68.195:60838.service - OpenSSH per-connection server daemon (139.178.68.195:60838). Jan 13 21:49:15.796770 sshd[1692]: Accepted publickey for core from 139.178.68.195 port 60838 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:15.798827 sshd-session[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:15.806982 systemd-logind[1488]: New session 6 of user core. Jan 13 21:49:15.816636 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 13 21:49:16.422087 sshd[1694]: Connection closed by 139.178.68.195 port 60838 Jan 13 21:49:16.421778 sshd-session[1692]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:16.426590 systemd[1]: sshd@4-10.230.9.94:22-139.178.68.195:60838.service: Deactivated successfully. Jan 13 21:49:16.429032 systemd[1]: session-6.scope: Deactivated successfully. Jan 13 21:49:16.430765 systemd-logind[1488]: Session 6 logged out. Waiting for processes to exit. Jan 13 21:49:16.432302 systemd-logind[1488]: Removed session 6. Jan 13 21:49:16.589746 systemd[1]: Started sshd@5-10.230.9.94:22-139.178.68.195:60840.service - OpenSSH per-connection server daemon (139.178.68.195:60840). Jan 13 21:49:17.486097 sshd[1699]: Accepted publickey for core from 139.178.68.195 port 60840 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:17.488411 sshd-session[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:17.497615 systemd-logind[1488]: New session 7 of user core. Jan 13 21:49:17.507554 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 13 21:49:18.102859 sshd[1701]: Connection closed by 139.178.68.195 port 60840 Jan 13 21:49:18.104043 sshd-session[1699]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:18.109676 systemd[1]: sshd@5-10.230.9.94:22-139.178.68.195:60840.service: Deactivated successfully. Jan 13 21:49:18.111977 systemd[1]: session-7.scope: Deactivated successfully. Jan 13 21:49:18.112905 systemd-logind[1488]: Session 7 logged out. Waiting for processes to exit. Jan 13 21:49:18.114675 systemd-logind[1488]: Removed session 7. Jan 13 21:49:18.270891 systemd[1]: Started sshd@6-10.230.9.94:22-139.178.68.195:60854.service - OpenSSH per-connection server daemon (139.178.68.195:60854). Jan 13 21:49:19.162273 sshd[1706]: Accepted publickey for core from 139.178.68.195 port 60854 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:19.164607 sshd-session[1706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:19.172669 systemd-logind[1488]: New session 8 of user core. 
Jan 13 21:49:19.184569 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 13 21:49:19.780939 sshd[1708]: Connection closed by 139.178.68.195 port 60854 Jan 13 21:49:19.781928 sshd-session[1706]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:19.786646 systemd[1]: sshd@6-10.230.9.94:22-139.178.68.195:60854.service: Deactivated successfully. Jan 13 21:49:19.788698 systemd[1]: session-8.scope: Deactivated successfully. Jan 13 21:49:19.789603 systemd-logind[1488]: Session 8 logged out. Waiting for processes to exit. Jan 13 21:49:19.791280 systemd-logind[1488]: Removed session 8. Jan 13 21:49:19.936178 systemd[1]: Started sshd@7-10.230.9.94:22-139.178.68.195:60858.service - OpenSSH per-connection server daemon (139.178.68.195:60858). Jan 13 21:49:20.844007 sshd[1713]: Accepted publickey for core from 139.178.68.195 port 60858 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:20.845894 sshd-session[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:20.852420 systemd-logind[1488]: New session 9 of user core. Jan 13 21:49:20.868772 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 13 21:49:21.334631 sudo[1716]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 13 21:49:21.335153 sudo[1716]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:49:21.357625 sudo[1716]: pam_unix(sudo:session): session closed for user root Jan 13 21:49:21.501397 sshd[1715]: Connection closed by 139.178.68.195 port 60858 Jan 13 21:49:21.502456 sshd-session[1713]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:21.507247 systemd[1]: sshd@7-10.230.9.94:22-139.178.68.195:60858.service: Deactivated successfully. Jan 13 21:49:21.509319 systemd[1]: session-9.scope: Deactivated successfully. Jan 13 21:49:21.510293 systemd-logind[1488]: Session 9 logged out. Waiting for processes to exit. Jan 13 21:49:21.512144 systemd-logind[1488]: Removed session 9. Jan 13 21:49:21.664785 systemd[1]: Started sshd@8-10.230.9.94:22-139.178.68.195:60862.service - OpenSSH per-connection server daemon (139.178.68.195:60862). Jan 13 21:49:21.715480 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 21:49:21.728626 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:49:21.906444 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:49:21.914157 (kubelet)[1731]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:49:21.982047 kubelet[1731]: E0113 21:49:21.981918 1731 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:49:21.984165 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:49:21.984450 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
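kubelet is now on restart counter 2 and fails with the same missing /var/lib/kubelet/config.yaml error as before; systemd keeps rescheduling it per the unit's Restart= policy. Two stock commands are enough to watch a unit stuck in such a loop (nothing Flatcar-specific assumed):

    systemctl status kubelet     # current state, restart counter, last exit status
    journalctl -u kubelet -e     # the unit's own log, ending with the config.yaml error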
Jan 13 21:49:22.561263 sshd[1721]: Accepted publickey for core from 139.178.68.195 port 60862 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:22.563853 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:22.572447 systemd-logind[1488]: New session 10 of user core. Jan 13 21:49:22.582607 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 13 21:49:23.039737 sudo[1741]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 13 21:49:23.040308 sudo[1741]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:49:23.046134 sudo[1741]: pam_unix(sudo:session): session closed for user root Jan 13 21:49:23.054735 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 13 21:49:23.055216 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:49:23.081924 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 21:49:23.122921 augenrules[1763]: No rules Jan 13 21:49:23.123977 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 21:49:23.124293 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 21:49:23.126633 sudo[1740]: pam_unix(sudo:session): session closed for user root Jan 13 21:49:23.270109 sshd[1739]: Connection closed by 139.178.68.195 port 60862 Jan 13 21:49:23.271235 sshd-session[1721]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:23.276309 systemd[1]: sshd@8-10.230.9.94:22-139.178.68.195:60862.service: Deactivated successfully. Jan 13 21:49:23.278980 systemd[1]: session-10.scope: Deactivated successfully. Jan 13 21:49:23.280320 systemd-logind[1488]: Session 10 logged out. Waiting for processes to exit. Jan 13 21:49:23.282005 systemd-logind[1488]: Removed session 10. Jan 13 21:49:23.430666 systemd[1]: Started sshd@9-10.230.9.94:22-139.178.68.195:60872.service - OpenSSH per-connection server daemon (139.178.68.195:60872). Jan 13 21:49:24.331369 sshd[1771]: Accepted publickey for core from 139.178.68.195 port 60872 ssh2: RSA SHA256:vi3+TWEATqYSo33Ofj/1BspKyb7lh9OPFZdZH6pph+w Jan 13 21:49:24.333727 sshd-session[1771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:49:24.340750 systemd-logind[1488]: New session 11 of user core. Jan 13 21:49:24.349575 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 13 21:49:24.811712 sudo[1774]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 13 21:49:24.812243 sudo[1774]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:49:25.507077 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:49:25.519047 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:49:25.562567 systemd[1]: Reloading requested from client PID 1806 ('systemctl') (unit session-11.scope)... Jan 13 21:49:25.562603 systemd[1]: Reloading... Jan 13 21:49:25.726401 zram_generator::config[1847]: No configuration found. Jan 13 21:49:25.916613 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 21:49:26.034701 systemd[1]: Reloading finished in 471 ms. 
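The reload above repeats the docker.socket warning seen during the earlier reload: ListenStream= still points below the legacy /var/run/ directory, and systemd rewrites it to /run/docker.sock on the fly. The change it asks for is a one-line edit to the socket unit (path taken from the log message itself), for example via a drop-in:

    [Socket]
    ListenStream=/run/docker.sock

Until the shipped unit, or a drop-in overriding it, is updated, the warning reappears on every daemon reload, as it does here.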
Jan 13 21:49:26.128962 (kubelet)[1906]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 21:49:26.129265 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:49:26.135343 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:49:26.136532 systemd[1]: kubelet.service: Deactivated successfully. Jan 13 21:49:26.136851 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:49:26.145801 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:49:26.283894 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:49:26.297061 (kubelet)[1917]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 21:49:26.357462 kubelet[1917]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 21:49:26.357462 kubelet[1917]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 21:49:26.357462 kubelet[1917]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 21:49:26.358191 kubelet[1917]: I0113 21:49:26.357545 1917 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 21:49:26.829160 kubelet[1917]: I0113 21:49:26.829091 1917 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Jan 13 21:49:26.830353 kubelet[1917]: I0113 21:49:26.829313 1917 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 21:49:26.830353 kubelet[1917]: I0113 21:49:26.829734 1917 server.go:929] "Client rotation is on, will bootstrap in background" Jan 13 21:49:26.861693 kubelet[1917]: I0113 21:49:26.861634 1917 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 21:49:26.874060 kubelet[1917]: E0113 21:49:26.874001 1917 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 13 21:49:26.874720 kubelet[1917]: I0113 21:49:26.874698 1917 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 13 21:49:26.882153 kubelet[1917]: I0113 21:49:26.882115 1917 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 13 21:49:26.882503 kubelet[1917]: I0113 21:49:26.882478 1917 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 13 21:49:26.882899 kubelet[1917]: I0113 21:49:26.882852 1917 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 21:49:26.883319 kubelet[1917]: I0113 21:49:26.883015 1917 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"10.230.9.94","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 13 21:49:26.883673 kubelet[1917]: I0113 21:49:26.883650 1917 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 21:49:26.883771 kubelet[1917]: I0113 21:49:26.883754 1917 container_manager_linux.go:300] "Creating device plugin manager" Jan 13 21:49:26.884065 kubelet[1917]: I0113 21:49:26.884035 1917 state_mem.go:36] "Initialized new in-memory state store" Jan 13 21:49:26.885797 kubelet[1917]: I0113 21:49:26.885771 1917 kubelet.go:408] "Attempting to sync node with API server" Jan 13 21:49:26.886000 kubelet[1917]: I0113 21:49:26.885954 1917 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 21:49:26.886159 kubelet[1917]: I0113 21:49:26.886130 1917 kubelet.go:314] "Adding apiserver pod source" Jan 13 21:49:26.886308 kubelet[1917]: I0113 21:49:26.886287 1917 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 21:49:26.886617 kubelet[1917]: E0113 21:49:26.886539 1917 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:26.886674 kubelet[1917]: E0113 21:49:26.886647 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:26.892140 kubelet[1917]: I0113 21:49:26.892080 1917 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 21:49:26.895019 kubelet[1917]: W0113 21:49:26.894717 1917 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Jan 13 21:49:26.895019 kubelet[1917]: E0113 21:49:26.894802 1917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Jan 13 21:49:26.895019 kubelet[1917]: W0113 21:49:26.894934 1917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "10.230.9.94" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Jan 13 21:49:26.895019 kubelet[1917]: E0113 21:49:26.894960 1917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"10.230.9.94\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Jan 13 21:49:26.895285 kubelet[1917]: I0113 21:49:26.895088 1917 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 21:49:26.895285 kubelet[1917]: W0113 21:49:26.895234 1917 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 13 21:49:26.896405 kubelet[1917]: I0113 21:49:26.896372 1917 server.go:1269] "Started kubelet" Jan 13 21:49:26.898356 kubelet[1917]: I0113 21:49:26.897425 1917 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 21:49:26.899318 kubelet[1917]: I0113 21:49:26.899234 1917 server.go:460] "Adding debug handlers to kubelet server" Jan 13 21:49:26.904665 kubelet[1917]: I0113 21:49:26.903816 1917 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 21:49:26.904665 kubelet[1917]: I0113 21:49:26.904254 1917 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 21:49:26.905438 kubelet[1917]: I0113 21:49:26.904994 1917 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 21:49:26.905438 kubelet[1917]: I0113 21:49:26.905229 1917 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 13 21:49:26.912772 kubelet[1917]: E0113 21:49:26.912747 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:26.912960 kubelet[1917]: I0113 21:49:26.912940 1917 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 13 21:49:26.913317 kubelet[1917]: I0113 21:49:26.913294 1917 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 13 21:49:26.913541 kubelet[1917]: I0113 21:49:26.913521 1917 reconciler.go:26] "Reconciler: start to sync state" Jan 13 21:49:26.915434 kubelet[1917]: I0113 21:49:26.915411 1917 factory.go:221] Registration of the systemd container factory successfully Jan 13 21:49:26.915664 kubelet[1917]: I0113 21:49:26.915636 1917 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 
21:49:26.917555 kubelet[1917]: E0113 21:49:26.917300 1917 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 21:49:26.922802 kubelet[1917]: I0113 21:49:26.921403 1917 factory.go:221] Registration of the containerd container factory successfully Jan 13 21:49:26.930920 kubelet[1917]: E0113 21:49:26.930871 1917 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"10.230.9.94\" not found" node="10.230.9.94" Jan 13 21:49:26.953269 kubelet[1917]: I0113 21:49:26.953217 1917 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 21:49:26.953483 kubelet[1917]: I0113 21:49:26.953457 1917 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 21:49:26.953632 kubelet[1917]: I0113 21:49:26.953613 1917 state_mem.go:36] "Initialized new in-memory state store" Jan 13 21:49:26.957374 kubelet[1917]: I0113 21:49:26.957288 1917 policy_none.go:49] "None policy: Start" Jan 13 21:49:26.959401 kubelet[1917]: I0113 21:49:26.959362 1917 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 21:49:26.959564 kubelet[1917]: I0113 21:49:26.959516 1917 state_mem.go:35] "Initializing new in-memory state store" Jan 13 21:49:26.981159 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 13 21:49:26.995388 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 13 21:49:27.009572 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 13 21:49:27.012375 kubelet[1917]: I0113 21:49:27.011569 1917 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 21:49:27.012375 kubelet[1917]: I0113 21:49:27.011951 1917 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 13 21:49:27.012375 kubelet[1917]: I0113 21:49:27.011998 1917 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 13 21:49:27.016776 kubelet[1917]: I0113 21:49:27.014561 1917 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 21:49:27.020595 kubelet[1917]: E0113 21:49:27.020564 1917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"10.230.9.94\" not found" Jan 13 21:49:27.024095 kubelet[1917]: I0113 21:49:27.024042 1917 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 21:49:27.026648 kubelet[1917]: I0113 21:49:27.026622 1917 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 13 21:49:27.026802 kubelet[1917]: I0113 21:49:27.026782 1917 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 21:49:27.026954 kubelet[1917]: I0113 21:49:27.026932 1917 kubelet.go:2321] "Starting kubelet main sync loop" Jan 13 21:49:27.030382 kubelet[1917]: E0113 21:49:27.029813 1917 kubelet.go:2345] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Jan 13 21:49:27.114155 kubelet[1917]: I0113 21:49:27.113950 1917 kubelet_node_status.go:72] "Attempting to register node" node="10.230.9.94" Jan 13 21:49:27.125491 kubelet[1917]: I0113 21:49:27.125459 1917 kubelet_node_status.go:75] "Successfully registered node" node="10.230.9.94" Jan 13 21:49:27.125591 kubelet[1917]: E0113 21:49:27.125499 1917 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"10.230.9.94\": node \"10.230.9.94\" not found" Jan 13 21:49:27.142448 kubelet[1917]: E0113 21:49:27.142411 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.243419 kubelet[1917]: E0113 21:49:27.243293 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.326184 sudo[1774]: pam_unix(sudo:session): session closed for user root Jan 13 21:49:27.343943 kubelet[1917]: E0113 21:49:27.343887 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.444700 kubelet[1917]: E0113 21:49:27.444073 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.469985 sshd[1773]: Connection closed by 139.178.68.195 port 60872 Jan 13 21:49:27.470869 sshd-session[1771]: pam_unix(sshd:session): session closed for user core Jan 13 21:49:27.476150 systemd[1]: sshd@9-10.230.9.94:22-139.178.68.195:60872.service: Deactivated successfully. Jan 13 21:49:27.478829 systemd[1]: session-11.scope: Deactivated successfully. Jan 13 21:49:27.481264 systemd-logind[1488]: Session 11 logged out. Waiting for processes to exit. Jan 13 21:49:27.483887 systemd-logind[1488]: Removed session 11. 
Jan 13 21:49:27.545016 kubelet[1917]: E0113 21:49:27.544939 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.645980 kubelet[1917]: E0113 21:49:27.645881 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.746732 kubelet[1917]: E0113 21:49:27.746645 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.832815 kubelet[1917]: I0113 21:49:27.832708 1917 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 13 21:49:27.833178 kubelet[1917]: W0113 21:49:27.833144 1917 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 13 21:49:27.833251 kubelet[1917]: W0113 21:49:27.833148 1917 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 13 21:49:27.847444 kubelet[1917]: E0113 21:49:27.847399 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:27.887063 kubelet[1917]: E0113 21:49:27.886933 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:27.948135 kubelet[1917]: E0113 21:49:27.948069 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:28.048992 kubelet[1917]: E0113 21:49:28.048785 1917 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"10.230.9.94\" not found" Jan 13 21:49:28.151024 kubelet[1917]: I0113 21:49:28.150962 1917 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24" Jan 13 21:49:28.151610 containerd[1511]: time="2025-01-13T21:49:28.151494199Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 13 21:49:28.152823 kubelet[1917]: I0113 21:49:28.151861 1917 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24" Jan 13 21:49:28.887787 kubelet[1917]: E0113 21:49:28.887716 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:28.887787 kubelet[1917]: I0113 21:49:28.887749 1917 apiserver.go:52] "Watching apiserver" Jan 13 21:49:28.895513 kubelet[1917]: E0113 21:49:28.894862 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:28.908116 systemd[1]: Created slice kubepods-besteffort-podd2569095_ef7f_4083_aae6_aee41ca302f9.slice - libcontainer container kubepods-besteffort-podd2569095_ef7f_4083_aae6_aee41ca302f9.slice. 
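The "Updating runtime config through cri with podcidr" and "Updating Pod CIDR" entries above are the kubelet handing the node's newly assigned 192.168.1.0/24 range to the container runtime over the CRI; containerd then reports "No cni config template is specified" and waits for Calico to drop a CNI config, which is also why the csi-node-driver pod sync fails with "cni plugin not initialized". A rough sketch of that CRI call, assuming containerd's default socket path (the log does not print it), looks like:

```go
// Sketch of the CRI UpdateRuntimeConfig call implied by the log above.
// The socket path is an assumption (containerd's default); it is not
// shown anywhere in this log.
package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	_, err = rt.UpdateRuntimeConfig(context.TODO(), &runtimeapi.UpdateRuntimeConfigRequest{
		RuntimeConfig: &runtimeapi.RuntimeConfig{
			NetworkConfig: &runtimeapi.NetworkConfig{PodCidr: "192.168.1.0/24"},
		},
	})
	if err != nil {
		log.Fatal(err)
	}
	log.Println("pod CIDR handed to the runtime; CNI config still pending")
}
```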
Jan 13 21:49:28.914833 kubelet[1917]: I0113 21:49:28.914776 1917 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 13 21:49:28.923859 systemd[1]: Created slice kubepods-besteffort-pod7c865249_4a3a_467d_8228_f83299031940.slice - libcontainer container kubepods-besteffort-pod7c865249_4a3a_467d_8228_f83299031940.slice. Jan 13 21:49:28.928246 kubelet[1917]: I0113 21:49:28.928198 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d2569095-ef7f-4083-aae6-aee41ca302f9-xtables-lock\") pod \"kube-proxy-qlq2z\" (UID: \"d2569095-ef7f-4083-aae6-aee41ca302f9\") " pod="kube-system/kube-proxy-qlq2z" Jan 13 21:49:28.928429 kubelet[1917]: I0113 21:49:28.928405 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7c865249-4a3a-467d-8228-f83299031940-node-certs\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.928564 kubelet[1917]: I0113 21:49:28.928541 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-var-lib-calico\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.928707 kubelet[1917]: I0113 21:49:28.928668 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-flexvol-driver-host\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.928869 kubelet[1917]: I0113 21:49:28.928843 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7c78\" (UniqueName: \"kubernetes.io/projected/ad5321ad-96ea-4705-87ad-19d987280dbc-kube-api-access-m7c78\") pod \"csi-node-driver-dwq7q\" (UID: \"ad5321ad-96ea-4705-87ad-19d987280dbc\") " pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:28.929011 kubelet[1917]: I0113 21:49:28.928986 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d2569095-ef7f-4083-aae6-aee41ca302f9-kube-proxy\") pod \"kube-proxy-qlq2z\" (UID: \"d2569095-ef7f-4083-aae6-aee41ca302f9\") " pod="kube-system/kube-proxy-qlq2z" Jan 13 21:49:28.929128 kubelet[1917]: I0113 21:49:28.929104 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ad5321ad-96ea-4705-87ad-19d987280dbc-kubelet-dir\") pod \"csi-node-driver-dwq7q\" (UID: \"ad5321ad-96ea-4705-87ad-19d987280dbc\") " pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:28.929259 kubelet[1917]: I0113 21:49:28.929226 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ad5321ad-96ea-4705-87ad-19d987280dbc-socket-dir\") pod \"csi-node-driver-dwq7q\" (UID: \"ad5321ad-96ea-4705-87ad-19d987280dbc\") " pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:28.929412 kubelet[1917]: I0113 21:49:28.929387 1917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-cni-net-dir\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.929566 kubelet[1917]: I0113 21:49:28.929541 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-lib-modules\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.929685 kubelet[1917]: I0113 21:49:28.929662 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-xtables-lock\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.929825 kubelet[1917]: I0113 21:49:28.929802 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-policysync\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.929952 kubelet[1917]: I0113 21:49:28.929928 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bfz5\" (UniqueName: \"kubernetes.io/projected/7c865249-4a3a-467d-8228-f83299031940-kube-api-access-4bfz5\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.931353 kubelet[1917]: I0113 21:49:28.930067 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ad5321ad-96ea-4705-87ad-19d987280dbc-varrun\") pod \"csi-node-driver-dwq7q\" (UID: \"ad5321ad-96ea-4705-87ad-19d987280dbc\") " pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:28.931353 kubelet[1917]: I0113 21:49:28.930110 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ad5321ad-96ea-4705-87ad-19d987280dbc-registration-dir\") pod \"csi-node-driver-dwq7q\" (UID: \"ad5321ad-96ea-4705-87ad-19d987280dbc\") " pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:28.931353 kubelet[1917]: I0113 21:49:28.930140 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7cr\" (UniqueName: \"kubernetes.io/projected/d2569095-ef7f-4083-aae6-aee41ca302f9-kube-api-access-lc7cr\") pod \"kube-proxy-qlq2z\" (UID: \"d2569095-ef7f-4083-aae6-aee41ca302f9\") " pod="kube-system/kube-proxy-qlq2z" Jan 13 21:49:28.931353 kubelet[1917]: I0113 21:49:28.930176 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c865249-4a3a-467d-8228-f83299031940-tigera-ca-bundle\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.931353 kubelet[1917]: I0113 21:49:28.930211 1917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-var-run-calico\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.931594 kubelet[1917]: I0113 21:49:28.930244 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-cni-bin-dir\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.931594 kubelet[1917]: I0113 21:49:28.930274 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7c865249-4a3a-467d-8228-f83299031940-cni-log-dir\") pod \"calico-node-tps9v\" (UID: \"7c865249-4a3a-467d-8228-f83299031940\") " pod="calico-system/calico-node-tps9v" Jan 13 21:49:28.931594 kubelet[1917]: I0113 21:49:28.930312 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2569095-ef7f-4083-aae6-aee41ca302f9-lib-modules\") pod \"kube-proxy-qlq2z\" (UID: \"d2569095-ef7f-4083-aae6-aee41ca302f9\") " pod="kube-system/kube-proxy-qlq2z" Jan 13 21:49:29.039493 kubelet[1917]: E0113 21:49:29.039443 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:29.039714 kubelet[1917]: W0113 21:49:29.039689 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:29.039862 kubelet[1917]: E0113 21:49:29.039836 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:29.062469 kubelet[1917]: E0113 21:49:29.062432 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:29.062469 kubelet[1917]: W0113 21:49:29.062463 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:29.064863 kubelet[1917]: E0113 21:49:29.064783 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:29.064863 kubelet[1917]: W0113 21:49:29.064816 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:29.065012 kubelet[1917]: E0113 21:49:29.064887 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:29.065012 kubelet[1917]: E0113 21:49:29.062846 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:29.068560 kubelet[1917]: E0113 21:49:29.068538 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:29.068742 kubelet[1917]: W0113 21:49:29.068719 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:29.068895 kubelet[1917]: E0113 21:49:29.068872 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:29.221339 containerd[1511]: time="2025-01-13T21:49:29.221274665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qlq2z,Uid:d2569095-ef7f-4083-aae6-aee41ca302f9,Namespace:kube-system,Attempt:0,}" Jan 13 21:49:29.232403 containerd[1511]: time="2025-01-13T21:49:29.232053694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tps9v,Uid:7c865249-4a3a-467d-8228-f83299031940,Namespace:calico-system,Attempt:0,}" Jan 13 21:49:29.888499 kubelet[1917]: E0113 21:49:29.888432 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:30.078000 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount582825226.mount: Deactivated successfully. Jan 13 21:49:30.086061 containerd[1511]: time="2025-01-13T21:49:30.085996365Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:49:30.088120 containerd[1511]: time="2025-01-13T21:49:30.088029433Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:49:30.090834 containerd[1511]: time="2025-01-13T21:49:30.090760484Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 21:49:30.090941 containerd[1511]: time="2025-01-13T21:49:30.090858565Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:49:30.091307 containerd[1511]: time="2025-01-13T21:49:30.091251788Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Jan 13 21:49:30.093747 containerd[1511]: time="2025-01-13T21:49:30.093685177Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:49:30.097397 containerd[1511]: time="2025-01-13T21:49:30.097358112Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 875.800495ms" Jan 13 21:49:30.099696 containerd[1511]: time="2025-01-13T21:49:30.099432364Z" level=info msg="Pulled image 
\"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 867.269633ms" Jan 13 21:49:30.228398 containerd[1511]: time="2025-01-13T21:49:30.228235002Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:49:30.229260 containerd[1511]: time="2025-01-13T21:49:30.228383905Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:49:30.229260 containerd[1511]: time="2025-01-13T21:49:30.228421290Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:30.229260 containerd[1511]: time="2025-01-13T21:49:30.228556428Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:30.231875 containerd[1511]: time="2025-01-13T21:49:30.231409873Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:49:30.232020 containerd[1511]: time="2025-01-13T21:49:30.231924391Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:49:30.235729 containerd[1511]: time="2025-01-13T21:49:30.231960673Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:30.235729 containerd[1511]: time="2025-01-13T21:49:30.234653568Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:30.334637 systemd[1]: Started cri-containerd-1b4866010193f5a71379d5ab40a300245ae332aa41a09576f83e82c5800e3fec.scope - libcontainer container 1b4866010193f5a71379d5ab40a300245ae332aa41a09576f83e82c5800e3fec. Jan 13 21:49:30.340480 systemd[1]: Started cri-containerd-c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101.scope - libcontainer container c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101. Jan 13 21:49:30.384631 containerd[1511]: time="2025-01-13T21:49:30.384549224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-tps9v,Uid:7c865249-4a3a-467d-8228-f83299031940,Namespace:calico-system,Attempt:0,} returns sandbox id \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\"" Jan 13 21:49:30.386281 containerd[1511]: time="2025-01-13T21:49:30.386249389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qlq2z,Uid:d2569095-ef7f-4083-aae6-aee41ca302f9,Namespace:kube-system,Attempt:0,} returns sandbox id \"1b4866010193f5a71379d5ab40a300245ae332aa41a09576f83e82c5800e3fec\"" Jan 13 21:49:30.390782 containerd[1511]: time="2025-01-13T21:49:30.390750985Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.4\"" Jan 13 21:49:30.889578 kubelet[1917]: E0113 21:49:30.889529 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:30.999309 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
Jan 13 21:49:31.029440 kubelet[1917]: E0113 21:49:31.028646 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:31.890760 kubelet[1917]: E0113 21:49:31.890670 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:31.933109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount770537829.mount: Deactivated successfully. Jan 13 21:49:32.653565 containerd[1511]: time="2025-01-13T21:49:32.653493758Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:32.654888 containerd[1511]: time="2025-01-13T21:49:32.654840169Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.4: active requests=0, bytes read=30230251" Jan 13 21:49:32.655896 containerd[1511]: time="2025-01-13T21:49:32.655837467Z" level=info msg="ImageCreate event name:\"sha256:ebf80573666f86f115452db568feb34f6f771c3bdc7bfed14b9577f992cfa300\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:32.658800 containerd[1511]: time="2025-01-13T21:49:32.658743313Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:1739b3febca392035bf6edfe31efdfa55226be7b57389b2001ae357f7dcb99cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:32.660109 containerd[1511]: time="2025-01-13T21:49:32.659885877Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.4\" with image id \"sha256:ebf80573666f86f115452db568feb34f6f771c3bdc7bfed14b9577f992cfa300\", repo tag \"registry.k8s.io/kube-proxy:v1.31.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:1739b3febca392035bf6edfe31efdfa55226be7b57389b2001ae357f7dcb99cf\", size \"30229262\" in 2.269094412s" Jan 13 21:49:32.660109 containerd[1511]: time="2025-01-13T21:49:32.659928245Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.4\" returns image reference \"sha256:ebf80573666f86f115452db568feb34f6f771c3bdc7bfed14b9577f992cfa300\"" Jan 13 21:49:32.662903 containerd[1511]: time="2025-01-13T21:49:32.662440168Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 13 21:49:32.663978 containerd[1511]: time="2025-01-13T21:49:32.663940818Z" level=info msg="CreateContainer within sandbox \"1b4866010193f5a71379d5ab40a300245ae332aa41a09576f83e82c5800e3fec\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 13 21:49:32.681124 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3007585837.mount: Deactivated successfully. 
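Taken together, the containerd entries around here trace the standard CRI lifecycle for kube-proxy: RunPodSandbox returns sandbox id 1b4866..., the kube-proxy:v1.31.4 image is pulled, CreateContainer is issued against that sandbox, and StartContainer follows just below. A condensed sketch of that sequence against the CRI API (socket path assumed, most config fields omitted) is:

```go
// Condensed sketch of the RunPodSandbox -> CreateContainer -> StartContainer
// sequence visible in the surrounding containerd/kubelet entries. The socket
// path and the near-empty configs are assumptions for illustration only.
package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx := context.TODO()

	// Metadata mirrors the PodSandboxMetadata printed in the RunPodSandbox entry.
	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "kube-proxy-qlq2z",
			Uid:       "d2569095-ef7f-4083-aae6-aee41ca302f9",
			Namespace: "kube-system",
			Attempt:   0,
		},
	}
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
	if err != nil {
		log.Fatal(err)
	}

	ctr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy", Attempt: 0},
			Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.31.4"},
		},
		SandboxConfig: sandboxCfg,
	})
	if err != nil {
		log.Fatal(err)
	}

	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: ctr.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Println("sandbox", sb.PodSandboxId, "container", ctr.ContainerId, "started")
}
```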
Jan 13 21:49:32.686895 containerd[1511]: time="2025-01-13T21:49:32.686858377Z" level=info msg="CreateContainer within sandbox \"1b4866010193f5a71379d5ab40a300245ae332aa41a09576f83e82c5800e3fec\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"cca32a37c75739e28edba1c25d7d245914a20e95f0f714dc51804d8ae57a72f8\"" Jan 13 21:49:32.687621 containerd[1511]: time="2025-01-13T21:49:32.687581931Z" level=info msg="StartContainer for \"cca32a37c75739e28edba1c25d7d245914a20e95f0f714dc51804d8ae57a72f8\"" Jan 13 21:49:32.732933 systemd[1]: Started cri-containerd-cca32a37c75739e28edba1c25d7d245914a20e95f0f714dc51804d8ae57a72f8.scope - libcontainer container cca32a37c75739e28edba1c25d7d245914a20e95f0f714dc51804d8ae57a72f8. Jan 13 21:49:32.776179 containerd[1511]: time="2025-01-13T21:49:32.776106028Z" level=info msg="StartContainer for \"cca32a37c75739e28edba1c25d7d245914a20e95f0f714dc51804d8ae57a72f8\" returns successfully" Jan 13 21:49:32.891821 kubelet[1917]: E0113 21:49:32.891744 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:33.029207 kubelet[1917]: E0113 21:49:33.028707 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:33.069165 kubelet[1917]: I0113 21:49:33.069089 1917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qlq2z" podStartSLOduration=3.797679546 podStartE2EDuration="6.069060808s" podCreationTimestamp="2025-01-13 21:49:27 +0000 UTC" firstStartedPulling="2025-01-13 21:49:30.389975391 +0000 UTC m=+4.086220369" lastFinishedPulling="2025-01-13 21:49:32.661356655 +0000 UTC m=+6.357601631" observedRunningTime="2025-01-13 21:49:33.068565701 +0000 UTC m=+6.764810690" watchObservedRunningTime="2025-01-13 21:49:33.069060808 +0000 UTC m=+6.765305812" Jan 13 21:49:33.142436 kubelet[1917]: E0113 21:49:33.142382 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.142436 kubelet[1917]: W0113 21:49:33.142412 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.142436 kubelet[1917]: E0113 21:49:33.142442 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.142718 kubelet[1917]: E0113 21:49:33.142706 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.142770 kubelet[1917]: W0113 21:49:33.142721 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.142770 kubelet[1917]: E0113 21:49:33.142736 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.143680 kubelet[1917]: E0113 21:49:33.143004 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.143680 kubelet[1917]: W0113 21:49:33.143024 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.143680 kubelet[1917]: E0113 21:49:33.143040 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.143680 kubelet[1917]: E0113 21:49:33.143300 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.143680 kubelet[1917]: W0113 21:49:33.143314 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.143680 kubelet[1917]: E0113 21:49:33.143349 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.143680 kubelet[1917]: E0113 21:49:33.143618 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.143680 kubelet[1917]: W0113 21:49:33.143631 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.143680 kubelet[1917]: E0113 21:49:33.143645 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.144311 kubelet[1917]: E0113 21:49:33.143896 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.144311 kubelet[1917]: W0113 21:49:33.143912 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.144311 kubelet[1917]: E0113 21:49:33.143926 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.144311 kubelet[1917]: E0113 21:49:33.144187 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.144311 kubelet[1917]: W0113 21:49:33.144200 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.144311 kubelet[1917]: E0113 21:49:33.144214 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.144472 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146119 kubelet[1917]: W0113 21:49:33.144485 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.144498 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.144753 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146119 kubelet[1917]: W0113 21:49:33.144767 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.144781 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.145055 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146119 kubelet[1917]: W0113 21:49:33.145070 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.145085 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.146119 kubelet[1917]: E0113 21:49:33.145384 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146500 kubelet[1917]: W0113 21:49:33.145399 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146500 kubelet[1917]: E0113 21:49:33.145413 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.146500 kubelet[1917]: E0113 21:49:33.145706 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146500 kubelet[1917]: W0113 21:49:33.145719 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146500 kubelet[1917]: E0113 21:49:33.145734 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.146500 kubelet[1917]: E0113 21:49:33.146270 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146500 kubelet[1917]: W0113 21:49:33.146284 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146500 kubelet[1917]: E0113 21:49:33.146298 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.146787 kubelet[1917]: E0113 21:49:33.146576 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146787 kubelet[1917]: W0113 21:49:33.146589 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146787 kubelet[1917]: E0113 21:49:33.146603 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.146909 kubelet[1917]: E0113 21:49:33.146862 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.146909 kubelet[1917]: W0113 21:49:33.146875 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.146909 kubelet[1917]: E0113 21:49:33.146889 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.147319 kubelet[1917]: E0113 21:49:33.147157 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.147319 kubelet[1917]: W0113 21:49:33.147180 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.147319 kubelet[1917]: E0113 21:49:33.147196 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.148075 kubelet[1917]: E0113 21:49:33.148044 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.148349 kubelet[1917]: W0113 21:49:33.148171 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.148349 kubelet[1917]: E0113 21:49:33.148198 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.148600 kubelet[1917]: E0113 21:49:33.148582 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.148774 kubelet[1917]: W0113 21:49:33.148663 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.148774 kubelet[1917]: E0113 21:49:33.148683 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.149284 kubelet[1917]: E0113 21:49:33.149136 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.149284 kubelet[1917]: W0113 21:49:33.149155 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.149284 kubelet[1917]: E0113 21:49:33.149171 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.149732 kubelet[1917]: E0113 21:49:33.149637 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.149732 kubelet[1917]: W0113 21:49:33.149655 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.149732 kubelet[1917]: E0113 21:49:33.149671 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.152822 kubelet[1917]: E0113 21:49:33.152779 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.152822 kubelet[1917]: W0113 21:49:33.152807 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.153050 kubelet[1917]: E0113 21:49:33.152825 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.153145 kubelet[1917]: E0113 21:49:33.153122 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.153145 kubelet[1917]: W0113 21:49:33.153135 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.153308 kubelet[1917]: E0113 21:49:33.153173 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.153537 kubelet[1917]: E0113 21:49:33.153516 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.153646 kubelet[1917]: W0113 21:49:33.153537 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.153646 kubelet[1917]: E0113 21:49:33.153574 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.153863 kubelet[1917]: E0113 21:49:33.153843 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.153928 kubelet[1917]: W0113 21:49:33.153864 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.153928 kubelet[1917]: E0113 21:49:33.153887 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.154168 kubelet[1917]: E0113 21:49:33.154149 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.154168 kubelet[1917]: W0113 21:49:33.154167 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.154313 kubelet[1917]: E0113 21:49:33.154190 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.154518 kubelet[1917]: E0113 21:49:33.154498 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.154574 kubelet[1917]: W0113 21:49:33.154520 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.154574 kubelet[1917]: E0113 21:49:33.154543 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.156507 kubelet[1917]: E0113 21:49:33.155430 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.156507 kubelet[1917]: W0113 21:49:33.155451 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.156507 kubelet[1917]: E0113 21:49:33.155474 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.156507 kubelet[1917]: E0113 21:49:33.155761 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.156507 kubelet[1917]: W0113 21:49:33.155774 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.156507 kubelet[1917]: E0113 21:49:33.155805 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.156507 kubelet[1917]: E0113 21:49:33.156119 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.156507 kubelet[1917]: W0113 21:49:33.156134 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.156507 kubelet[1917]: E0113 21:49:33.156252 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.156878 kubelet[1917]: E0113 21:49:33.156551 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.156878 kubelet[1917]: W0113 21:49:33.156564 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.156878 kubelet[1917]: E0113 21:49:33.156585 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.157774 kubelet[1917]: E0113 21:49:33.157745 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.157774 kubelet[1917]: W0113 21:49:33.157765 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.157887 kubelet[1917]: E0113 21:49:33.157780 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:33.158304 kubelet[1917]: E0113 21:49:33.158271 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:33.158304 kubelet[1917]: W0113 21:49:33.158291 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:33.158418 kubelet[1917]: E0113 21:49:33.158307 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:33.892470 kubelet[1917]: E0113 21:49:33.892406 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:33.993403 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2906849075.mount: Deactivated successfully. Jan 13 21:49:34.129528 containerd[1511]: time="2025-01-13T21:49:34.129467865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:34.130718 containerd[1511]: time="2025-01-13T21:49:34.130670880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=6855343" Jan 13 21:49:34.131967 containerd[1511]: time="2025-01-13T21:49:34.131926061Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:34.134884 containerd[1511]: time="2025-01-13T21:49:34.134848031Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:34.136043 containerd[1511]: time="2025-01-13T21:49:34.135999948Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.47351127s" Jan 13 21:49:34.136123 containerd[1511]: time="2025-01-13T21:49:34.136045733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 13 21:49:34.139666 containerd[1511]: time="2025-01-13T21:49:34.139518469Z" level=info msg="CreateContainer within sandbox \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 13 21:49:34.156021 kubelet[1917]: E0113 21:49:34.155526 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.156202 kubelet[1917]: W0113 21:49:34.156168 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.156305 kubelet[1917]: E0113 21:49:34.156213 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.156540 kubelet[1917]: E0113 21:49:34.156520 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.156540 kubelet[1917]: W0113 21:49:34.156539 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.156701 kubelet[1917]: E0113 21:49:34.156555 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.156823 kubelet[1917]: E0113 21:49:34.156798 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.156885 kubelet[1917]: W0113 21:49:34.156823 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.156885 kubelet[1917]: E0113 21:49:34.156839 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.157097 kubelet[1917]: E0113 21:49:34.157078 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.157097 kubelet[1917]: W0113 21:49:34.157096 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.157245 kubelet[1917]: E0113 21:49:34.157111 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.157476 kubelet[1917]: E0113 21:49:34.157465 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.157534 kubelet[1917]: W0113 21:49:34.157478 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.157534 kubelet[1917]: E0113 21:49:34.157493 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.157752 kubelet[1917]: E0113 21:49:34.157733 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.157752 kubelet[1917]: W0113 21:49:34.157751 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.157892 kubelet[1917]: E0113 21:49:34.157765 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.158017 kubelet[1917]: E0113 21:49:34.157998 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.158017 kubelet[1917]: W0113 21:49:34.158016 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.158162 kubelet[1917]: E0113 21:49:34.158030 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.158305 kubelet[1917]: E0113 21:49:34.158286 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.158305 kubelet[1917]: W0113 21:49:34.158305 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.158482 kubelet[1917]: E0113 21:49:34.158319 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.158611 kubelet[1917]: E0113 21:49:34.158592 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.158611 kubelet[1917]: W0113 21:49:34.158611 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.158729 kubelet[1917]: E0113 21:49:34.158626 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.158876 kubelet[1917]: E0113 21:49:34.158858 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.158876 kubelet[1917]: W0113 21:49:34.158875 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.158984 kubelet[1917]: E0113 21:49:34.158890 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.159138 kubelet[1917]: E0113 21:49:34.159119 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.159138 kubelet[1917]: W0113 21:49:34.159137 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.159271 kubelet[1917]: E0113 21:49:34.159152 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.159431 kubelet[1917]: E0113 21:49:34.159411 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.159431 kubelet[1917]: W0113 21:49:34.159425 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.159749 kubelet[1917]: E0113 21:49:34.159439 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.159811 kubelet[1917]: E0113 21:49:34.159770 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.159811 kubelet[1917]: W0113 21:49:34.159792 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.159811 kubelet[1917]: E0113 21:49:34.159806 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.160080 kubelet[1917]: E0113 21:49:34.160062 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.160080 kubelet[1917]: W0113 21:49:34.160079 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.160265 kubelet[1917]: E0113 21:49:34.160093 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.160392 kubelet[1917]: E0113 21:49:34.160373 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.160477 kubelet[1917]: W0113 21:49:34.160392 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.160477 kubelet[1917]: E0113 21:49:34.160437 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.160712 kubelet[1917]: E0113 21:49:34.160695 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.160830 kubelet[1917]: W0113 21:49:34.160713 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.160830 kubelet[1917]: E0113 21:49:34.160728 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.161112 kubelet[1917]: E0113 21:49:34.161094 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.161164 kubelet[1917]: W0113 21:49:34.161115 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.161164 kubelet[1917]: E0113 21:49:34.161137 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.161707 kubelet[1917]: E0113 21:49:34.161657 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.161707 kubelet[1917]: W0113 21:49:34.161699 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.161839 kubelet[1917]: E0113 21:49:34.161715 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.162636 kubelet[1917]: E0113 21:49:34.162609 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.162636 kubelet[1917]: W0113 21:49:34.162630 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.162821 kubelet[1917]: E0113 21:49:34.162645 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.162914 kubelet[1917]: E0113 21:49:34.162881 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.162914 kubelet[1917]: W0113 21:49:34.162895 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.162914 kubelet[1917]: E0113 21:49:34.162910 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.163331 kubelet[1917]: E0113 21:49:34.163242 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.163331 kubelet[1917]: W0113 21:49:34.163255 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.163331 kubelet[1917]: E0113 21:49:34.163270 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.163733 kubelet[1917]: E0113 21:49:34.163712 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.163733 kubelet[1917]: W0113 21:49:34.163731 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.163898 kubelet[1917]: E0113 21:49:34.163754 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.164574 kubelet[1917]: E0113 21:49:34.164553 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.164574 kubelet[1917]: W0113 21:49:34.164573 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.164674 kubelet[1917]: E0113 21:49:34.164596 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.164879 kubelet[1917]: E0113 21:49:34.164864 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.165046 kubelet[1917]: W0113 21:49:34.164886 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.165046 kubelet[1917]: E0113 21:49:34.164972 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.165408 kubelet[1917]: E0113 21:49:34.165383 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.165494 kubelet[1917]: W0113 21:49:34.165420 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.165494 kubelet[1917]: E0113 21:49:34.165448 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.166003 kubelet[1917]: E0113 21:49:34.165971 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.166003 kubelet[1917]: W0113 21:49:34.166000 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.166139 kubelet[1917]: E0113 21:49:34.166023 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.166392 kubelet[1917]: E0113 21:49:34.166372 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.167109 kubelet[1917]: W0113 21:49:34.166395 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.167109 kubelet[1917]: E0113 21:49:34.166461 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.167109 kubelet[1917]: E0113 21:49:34.166663 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.167109 kubelet[1917]: W0113 21:49:34.166676 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.167109 kubelet[1917]: E0113 21:49:34.166710 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.167796 kubelet[1917]: E0113 21:49:34.166999 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.167796 kubelet[1917]: W0113 21:49:34.167174 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.167796 kubelet[1917]: E0113 21:49:34.167211 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.167796 kubelet[1917]: E0113 21:49:34.167593 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.167796 kubelet[1917]: W0113 21:49:34.167607 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.167796 kubelet[1917]: E0113 21:49:34.167621 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.171522 kubelet[1917]: E0113 21:49:34.169704 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.171522 kubelet[1917]: W0113 21:49:34.169724 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.171522 kubelet[1917]: E0113 21:49:34.171371 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:49:34.172211 kubelet[1917]: E0113 21:49:34.172094 1917 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:49:34.172211 kubelet[1917]: W0113 21:49:34.172110 1917 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:49:34.172211 kubelet[1917]: E0113 21:49:34.172126 1917 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:49:34.177474 containerd[1511]: time="2025-01-13T21:49:34.177431823Z" level=info msg="CreateContainer within sandbox \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7\"" Jan 13 21:49:34.178363 containerd[1511]: time="2025-01-13T21:49:34.178268141Z" level=info msg="StartContainer for \"cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7\"" Jan 13 21:49:34.220544 systemd[1]: Started cri-containerd-cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7.scope - libcontainer container cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7. Jan 13 21:49:34.262376 containerd[1511]: time="2025-01-13T21:49:34.260481147Z" level=info msg="StartContainer for \"cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7\" returns successfully" Jan 13 21:49:34.279089 systemd[1]: cri-containerd-cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7.scope: Deactivated successfully. Jan 13 21:49:34.588887 containerd[1511]: time="2025-01-13T21:49:34.588725909Z" level=info msg="shim disconnected" id=cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7 namespace=k8s.io Jan 13 21:49:34.588887 containerd[1511]: time="2025-01-13T21:49:34.588817471Z" level=warning msg="cleaning up after shim disconnected" id=cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7 namespace=k8s.io Jan 13 21:49:34.588887 containerd[1511]: time="2025-01-13T21:49:34.588835153Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 21:49:34.892983 kubelet[1917]: E0113 21:49:34.892741 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:34.936508 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cd252109f9ff4ac47934725b7b571c9dd0f4778f1c2979825027173bfebe66b7-rootfs.mount: Deactivated successfully. 
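The burst of kubelet errors above comes from FlexVolume plugin probing: a plugin directory named nodeagent~uds exists under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, but the `uds` executable it should contain is not found in $PATH, so every `init` call produces empty output and the JSON unmarshal fails with "unexpected end of JSON input". For context, a FlexVolume driver is just an executable that answers `init` (and the other verbs) with a JSON status object on stdout. The Go sketch below shows roughly the shape of that reply; it is illustrative only, the struct name and field set are assumptions and it is not the actual nodeagent~uds driver source.

```go
// Illustrative FlexVolume driver skeleton (assumed field names): the "init"
// verb must print a JSON status object on stdout. The kubelet errors above
// ("unexpected end of JSON input") happen because the expected executable is
// missing, so there is nothing to unmarshal.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus approximates the JSON shape kubelet's driver-call.go expects.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
		return
	}
	// Other FlexVolume verbs (mount, unmount, ...) would be handled here.
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
	os.Exit(1)
}
```

With a binary like this in place (or the stale nodeagent~uds directory removed), the repeated probe failures above should stop.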
Jan 13 21:49:35.028664 kubelet[1917]: E0113 21:49:35.028191 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:35.060369 containerd[1511]: time="2025-01-13T21:49:35.060251897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 13 21:49:35.893359 kubelet[1917]: E0113 21:49:35.893254 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:36.893902 kubelet[1917]: E0113 21:49:36.893717 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:37.027764 kubelet[1917]: E0113 21:49:37.027667 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:37.894557 kubelet[1917]: E0113 21:49:37.894508 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:38.895518 kubelet[1917]: E0113 21:49:38.895476 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:39.030602 kubelet[1917]: E0113 21:49:39.030092 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:39.895939 kubelet[1917]: E0113 21:49:39.895838 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:40.310240 containerd[1511]: time="2025-01-13T21:49:40.309168165Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:40.311846 containerd[1511]: time="2025-01-13T21:49:40.311805241Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Jan 13 21:49:40.312694 containerd[1511]: time="2025-01-13T21:49:40.312651574Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:40.316452 containerd[1511]: time="2025-01-13T21:49:40.316420922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:40.317618 containerd[1511]: time="2025-01-13T21:49:40.317568971Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size 
\"97647238\" in 5.257265622s" Jan 13 21:49:40.317756 containerd[1511]: time="2025-01-13T21:49:40.317729983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 13 21:49:40.320773 containerd[1511]: time="2025-01-13T21:49:40.320724912Z" level=info msg="CreateContainer within sandbox \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 13 21:49:40.338314 containerd[1511]: time="2025-01-13T21:49:40.338263062Z" level=info msg="CreateContainer within sandbox \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516\"" Jan 13 21:49:40.339199 containerd[1511]: time="2025-01-13T21:49:40.339077521Z" level=info msg="StartContainer for \"7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516\"" Jan 13 21:49:40.385557 systemd[1]: Started cri-containerd-7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516.scope - libcontainer container 7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516. Jan 13 21:49:40.434582 containerd[1511]: time="2025-01-13T21:49:40.434528094Z" level=info msg="StartContainer for \"7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516\" returns successfully" Jan 13 21:49:40.896560 kubelet[1917]: E0113 21:49:40.896465 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:41.028837 kubelet[1917]: E0113 21:49:41.028742 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:41.150128 systemd[1]: cri-containerd-7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516.scope: Deactivated successfully. Jan 13 21:49:41.182062 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516-rootfs.mount: Deactivated successfully. 
Jan 13 21:49:41.249058 kubelet[1917]: I0113 21:49:41.248722 1917 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jan 13 21:49:41.484131 containerd[1511]: time="2025-01-13T21:49:41.483885670Z" level=info msg="shim disconnected" id=7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516 namespace=k8s.io Jan 13 21:49:41.484131 containerd[1511]: time="2025-01-13T21:49:41.483975461Z" level=warning msg="cleaning up after shim disconnected" id=7a39bb605fe9dcfe909ea4b91a09d192655688e3c167a22cbfe2850e7cd86516 namespace=k8s.io Jan 13 21:49:41.484131 containerd[1511]: time="2025-01-13T21:49:41.483990122Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 21:49:41.897466 kubelet[1917]: E0113 21:49:41.897241 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:42.087779 containerd[1511]: time="2025-01-13T21:49:42.087433630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 13 21:49:42.898136 kubelet[1917]: E0113 21:49:42.897935 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:43.038737 systemd[1]: Created slice kubepods-besteffort-podad5321ad_96ea_4705_87ad_19d987280dbc.slice - libcontainer container kubepods-besteffort-podad5321ad_96ea_4705_87ad_19d987280dbc.slice. Jan 13 21:49:43.043205 containerd[1511]: time="2025-01-13T21:49:43.043140083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:0,}" Jan 13 21:49:43.139432 containerd[1511]: time="2025-01-13T21:49:43.138611622Z" level=error msg="Failed to destroy network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:43.139831 containerd[1511]: time="2025-01-13T21:49:43.139398796Z" level=error msg="encountered an error cleaning up failed sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:43.139831 containerd[1511]: time="2025-01-13T21:49:43.139779953Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:43.141575 kubelet[1917]: E0113 21:49:43.141518 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:43.141691 kubelet[1917]: E0113 21:49:43.141647 1917 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:43.141828 kubelet[1917]: E0113 21:49:43.141693 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:43.141885 kubelet[1917]: E0113 21:49:43.141790 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:43.143062 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf-shm.mount: Deactivated successfully. Jan 13 21:49:43.898630 kubelet[1917]: E0113 21:49:43.898521 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:44.092039 kubelet[1917]: I0113 21:49:44.091981 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf" Jan 13 21:49:44.095363 containerd[1511]: time="2025-01-13T21:49:44.092940728Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:44.095363 containerd[1511]: time="2025-01-13T21:49:44.093455616Z" level=info msg="Ensure that sandbox 0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf in task-service has been cleanup successfully" Jan 13 21:49:44.096023 containerd[1511]: time="2025-01-13T21:49:44.095987192Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:44.096586 containerd[1511]: time="2025-01-13T21:49:44.096103507Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:44.098117 containerd[1511]: time="2025-01-13T21:49:44.096931862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:1,}" Jan 13 21:49:44.098516 systemd[1]: run-netns-cni\x2d111e24ce\x2d9bb3\x2d32ae\x2d2828\x2dd2179c121104.mount: Deactivated successfully. 
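From 21:49:43 onward, every sandbox attempt for csi-node-driver-dwq7q fails in the Calico CNI plugin with "stat /var/lib/calico/nodename: no such file or directory". As the error text itself says, that file is written by the calico/node container once it is running with /var/lib/calico/ mounted; until then, each CNI ADD (and the matching DEL during cleanup) fails the same way. The sketch below simply reproduces that check on the node as a diagnostic illustration; it is not part of the plugin's code and assumes it is run on the affected host.

```go
// Diagnostic sketch: check whether calico/node has written the nodename file
// that the CNI plugin stats before setting up pod networking.
package main

import (
	"fmt"
	"os"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		fmt.Printf("%s not readable (%v):\n", nodenameFile, err)
		fmt.Println("calico/node has not written it yet, so CNI ADD/DEL calls keep failing.")
		os.Exit(1)
	}
	fmt.Printf("calico/node is up; node name recorded as %q\n", string(data))
}
```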
Jan 13 21:49:44.212477 containerd[1511]: time="2025-01-13T21:49:44.208660478Z" level=error msg="Failed to destroy network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:44.212477 containerd[1511]: time="2025-01-13T21:49:44.209252881Z" level=error msg="encountered an error cleaning up failed sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:44.212477 containerd[1511]: time="2025-01-13T21:49:44.209374151Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:44.212758 kubelet[1917]: E0113 21:49:44.210603 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:44.212758 kubelet[1917]: E0113 21:49:44.210719 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:44.212758 kubelet[1917]: E0113 21:49:44.210752 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:44.212941 kubelet[1917]: E0113 21:49:44.210845 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" 
podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:44.214148 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f-shm.mount: Deactivated successfully. Jan 13 21:49:44.237030 update_engine[1491]: I20250113 21:49:44.236832 1491 update_attempter.cc:509] Updating boot flags... Jan 13 21:49:44.320535 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 44 scanned by (udev-worker) (2503) Jan 13 21:49:44.454305 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 44 scanned by (udev-worker) (2505) Jan 13 21:49:44.899273 kubelet[1917]: E0113 21:49:44.899204 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:45.095486 kubelet[1917]: I0113 21:49:45.095438 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f" Jan 13 21:49:45.096766 containerd[1511]: time="2025-01-13T21:49:45.096705054Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:45.097724 containerd[1511]: time="2025-01-13T21:49:45.097037468Z" level=info msg="Ensure that sandbox 5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f in task-service has been cleanup successfully" Jan 13 21:49:45.097724 containerd[1511]: time="2025-01-13T21:49:45.097489215Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:45.097724 containerd[1511]: time="2025-01-13T21:49:45.097511047Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:45.100391 containerd[1511]: time="2025-01-13T21:49:45.099449984Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:45.100391 containerd[1511]: time="2025-01-13T21:49:45.099558353Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:45.100391 containerd[1511]: time="2025-01-13T21:49:45.099576959Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:45.100391 containerd[1511]: time="2025-01-13T21:49:45.100109211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:2,}" Jan 13 21:49:45.101030 systemd[1]: run-netns-cni\x2da08c51ef\x2d5065\x2dc732\x2d3e20\x2daf56064ac19d.mount: Deactivated successfully. 
Jan 13 21:49:45.213354 containerd[1511]: time="2025-01-13T21:49:45.213038493Z" level=error msg="Failed to destroy network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.217227 containerd[1511]: time="2025-01-13T21:49:45.215916199Z" level=error msg="encountered an error cleaning up failed sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.217227 containerd[1511]: time="2025-01-13T21:49:45.216077829Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.217465 kubelet[1917]: E0113 21:49:45.216371 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.217465 kubelet[1917]: E0113 21:49:45.216450 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:45.217465 kubelet[1917]: E0113 21:49:45.216481 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:45.217220 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856-shm.mount: Deactivated successfully. 
Jan 13 21:49:45.217796 kubelet[1917]: E0113 21:49:45.216537 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:45.237632 kubelet[1917]: I0113 21:49:45.237568 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66v5\" (UniqueName: \"kubernetes.io/projected/59ee205f-19de-445d-aaed-96039972d960-kube-api-access-x66v5\") pod \"nginx-deployment-8587fbcb89-fwj99\" (UID: \"59ee205f-19de-445d-aaed-96039972d960\") " pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:45.238888 systemd[1]: Created slice kubepods-besteffort-pod59ee205f_19de_445d_aaed_96039972d960.slice - libcontainer container kubepods-besteffort-pod59ee205f_19de_445d_aaed_96039972d960.slice. Jan 13 21:49:45.545122 containerd[1511]: time="2025-01-13T21:49:45.544987886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:0,}" Jan 13 21:49:45.634124 containerd[1511]: time="2025-01-13T21:49:45.634030541Z" level=error msg="Failed to destroy network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.634680 containerd[1511]: time="2025-01-13T21:49:45.634631472Z" level=error msg="encountered an error cleaning up failed sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.634771 containerd[1511]: time="2025-01-13T21:49:45.634727760Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.635529 kubelet[1917]: E0113 21:49:45.635037 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:45.635529 kubelet[1917]: E0113 
21:49:45.635113 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:45.635529 kubelet[1917]: E0113 21:49:45.635149 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:45.635746 kubelet[1917]: E0113 21:49:45.635206 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:45.902567 kubelet[1917]: E0113 21:49:45.901206 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:46.104008 kubelet[1917]: I0113 21:49:46.103964 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856" Jan 13 21:49:46.105206 containerd[1511]: time="2025-01-13T21:49:46.105156545Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:46.105704 containerd[1511]: time="2025-01-13T21:49:46.105517206Z" level=info msg="Ensure that sandbox 3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856 in task-service has been cleanup successfully" Jan 13 21:49:46.106168 containerd[1511]: time="2025-01-13T21:49:46.106120882Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:46.106168 containerd[1511]: time="2025-01-13T21:49:46.106151114Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:46.108521 systemd[1]: run-netns-cni\x2d88ee62f3\x2d86e7\x2dfba5\x2d3bcf\x2d50bbb01b2f38.mount: Deactivated successfully. 
Jan 13 21:49:46.110077 containerd[1511]: time="2025-01-13T21:49:46.108984633Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:46.110077 containerd[1511]: time="2025-01-13T21:49:46.109110727Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:46.110077 containerd[1511]: time="2025-01-13T21:49:46.109130404Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:46.110077 containerd[1511]: time="2025-01-13T21:49:46.109453077Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:46.110077 containerd[1511]: time="2025-01-13T21:49:46.109569632Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:46.110077 containerd[1511]: time="2025-01-13T21:49:46.109589289Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:46.111219 kubelet[1917]: I0113 21:49:46.110792 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7" Jan 13 21:49:46.111724 containerd[1511]: time="2025-01-13T21:49:46.111688887Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:46.111997 containerd[1511]: time="2025-01-13T21:49:46.111818560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:3,}" Jan 13 21:49:46.112075 containerd[1511]: time="2025-01-13T21:49:46.111932584Z" level=info msg="Ensure that sandbox 4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7 in task-service has been cleanup successfully" Jan 13 21:49:46.112424 containerd[1511]: time="2025-01-13T21:49:46.112246681Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:46.112424 containerd[1511]: time="2025-01-13T21:49:46.112275282Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:46.113026 containerd[1511]: time="2025-01-13T21:49:46.112894364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:1,}" Jan 13 21:49:46.114742 systemd[1]: run-netns-cni\x2db8eb02eb\x2d6cd6\x2dbfd1\x2db646\x2dacb2c972061e.mount: Deactivated successfully. 
Jan 13 21:49:46.556958 containerd[1511]: time="2025-01-13T21:49:46.556768608Z" level=error msg="Failed to destroy network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.557930 containerd[1511]: time="2025-01-13T21:49:46.557648056Z" level=error msg="encountered an error cleaning up failed sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.557930 containerd[1511]: time="2025-01-13T21:49:46.557755057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.558177 kubelet[1917]: E0113 21:49:46.558108 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.558296 kubelet[1917]: E0113 21:49:46.558219 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:46.558296 kubelet[1917]: E0113 21:49:46.558253 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:46.558787 kubelet[1917]: E0113 21:49:46.558319 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" 
podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:46.563179 containerd[1511]: time="2025-01-13T21:49:46.563039206Z" level=error msg="Failed to destroy network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.563692 containerd[1511]: time="2025-01-13T21:49:46.563643017Z" level=error msg="encountered an error cleaning up failed sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.563922 containerd[1511]: time="2025-01-13T21:49:46.563817608Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.564036 kubelet[1917]: E0113 21:49:46.563984 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:46.564097 kubelet[1917]: E0113 21:49:46.564034 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:46.564097 kubelet[1917]: E0113 21:49:46.564059 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:46.564233 kubelet[1917]: E0113 21:49:46.564115 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:46.887268 kubelet[1917]: E0113 21:49:46.887036 1917 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:46.903035 kubelet[1917]: E0113 21:49:46.902932 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:47.103283 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d-shm.mount: Deactivated successfully. Jan 13 21:49:47.118835 kubelet[1917]: I0113 21:49:47.118801 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d" Jan 13 21:49:47.120495 containerd[1511]: time="2025-01-13T21:49:47.120445801Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:47.120930 containerd[1511]: time="2025-01-13T21:49:47.120747024Z" level=info msg="Ensure that sandbox 81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d in task-service has been cleanup successfully" Jan 13 21:49:47.126039 containerd[1511]: time="2025-01-13T21:49:47.124099071Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:47.126039 containerd[1511]: time="2025-01-13T21:49:47.124130278Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:47.126039 containerd[1511]: time="2025-01-13T21:49:47.124562125Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:47.126039 containerd[1511]: time="2025-01-13T21:49:47.124692057Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:47.126039 containerd[1511]: time="2025-01-13T21:49:47.124709302Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:47.125041 systemd[1]: run-netns-cni\x2d1d8222cd\x2d3d5d\x2dbe79\x2ddc0a\x2d18da34496b00.mount: Deactivated successfully. 
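Note: every sandbox ADD and DEL in this stretch of the log fails for the same underlying reason the plugin reports: stat /var/lib/calico/nodename: no such file or directory. That file is written by the calico/node container after it starts, so until it exists containerd can neither set up nor tear down pod networking; each failed sandbox is therefore marked SANDBOX_UNKNOWN and kubelet retries with the Attempt counter incremented. A minimal Go reproduction of the failing check, assuming nothing beyond what the error message itself states:

package main

import (
	"fmt"
	"os"
)

// nodenameFile is the path named in the recurring error; calico/node is
// expected to write the node's name here once it is running and has
// mounted /var/lib/calico/ for the CNI plugin to read.
const nodenameFile = "/var/lib/calico/nodename"

func main() {
	if _, err := os.Stat(nodenameFile); err != nil {
		// Mirrors the message attached to every failed ADD and DEL above.
		fmt.Printf("stat %s failed (%v): check that the calico/node container is running and has mounted /var/lib/calico/\n", nodenameFile, err)
		os.Exit(1)
	}
	fmt.Println("nodename file present; CNI ADD/DEL can proceed")
}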
Jan 13 21:49:47.128587 containerd[1511]: time="2025-01-13T21:49:47.127780051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:2,}" Jan 13 21:49:47.142856 kubelet[1917]: I0113 21:49:47.142756 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98" Jan 13 21:49:47.144253 containerd[1511]: time="2025-01-13T21:49:47.143979249Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:47.144382 containerd[1511]: time="2025-01-13T21:49:47.144256686Z" level=info msg="Ensure that sandbox 9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98 in task-service has been cleanup successfully" Jan 13 21:49:47.147919 containerd[1511]: time="2025-01-13T21:49:47.147879032Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:47.147919 containerd[1511]: time="2025-01-13T21:49:47.147912103Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:47.148502 systemd[1]: run-netns-cni\x2dd83405ed\x2d5857\x2ddf03\x2de170\x2d465a2fa00d49.mount: Deactivated successfully. Jan 13 21:49:47.152825 containerd[1511]: time="2025-01-13T21:49:47.152788820Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:47.152915 containerd[1511]: time="2025-01-13T21:49:47.152898832Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:47.152962 containerd[1511]: time="2025-01-13T21:49:47.152918307Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:47.154775 containerd[1511]: time="2025-01-13T21:49:47.154744442Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:47.154888 containerd[1511]: time="2025-01-13T21:49:47.154860475Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:47.154888 containerd[1511]: time="2025-01-13T21:49:47.154886201Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:47.162377 containerd[1511]: time="2025-01-13T21:49:47.162312975Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:47.162498 containerd[1511]: time="2025-01-13T21:49:47.162460486Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:47.162498 containerd[1511]: time="2025-01-13T21:49:47.162480161Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:47.163689 containerd[1511]: time="2025-01-13T21:49:47.163493812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:4,}" Jan 13 21:49:47.349221 containerd[1511]: time="2025-01-13T21:49:47.349010329Z" level=error msg="Failed to 
destroy network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.350449 containerd[1511]: time="2025-01-13T21:49:47.350008343Z" level=error msg="encountered an error cleaning up failed sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.350449 containerd[1511]: time="2025-01-13T21:49:47.350122733Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.350719 kubelet[1917]: E0113 21:49:47.350490 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.350719 kubelet[1917]: E0113 21:49:47.350588 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:47.350719 kubelet[1917]: E0113 21:49:47.350618 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:47.351016 kubelet[1917]: E0113 21:49:47.350694 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:47.351515 
containerd[1511]: time="2025-01-13T21:49:47.351466309Z" level=error msg="Failed to destroy network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.352118 containerd[1511]: time="2025-01-13T21:49:47.352069835Z" level=error msg="encountered an error cleaning up failed sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.352192 containerd[1511]: time="2025-01-13T21:49:47.352139012Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.352446 kubelet[1917]: E0113 21:49:47.352413 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:47.352779 kubelet[1917]: E0113 21:49:47.352653 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:47.352779 kubelet[1917]: E0113 21:49:47.352686 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:47.353001 kubelet[1917]: E0113 21:49:47.352799 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" 
podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:47.903459 kubelet[1917]: E0113 21:49:47.903367 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:48.106185 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a-shm.mount: Deactivated successfully. Jan 13 21:49:48.150850 kubelet[1917]: I0113 21:49:48.149544 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906" Jan 13 21:49:48.151126 containerd[1511]: time="2025-01-13T21:49:48.150651997Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:49:48.151126 containerd[1511]: time="2025-01-13T21:49:48.151017977Z" level=info msg="Ensure that sandbox 74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906 in task-service has been cleanup successfully" Jan 13 21:49:48.151905 containerd[1511]: time="2025-01-13T21:49:48.151286928Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:49:48.151905 containerd[1511]: time="2025-01-13T21:49:48.151309341Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:49:48.151905 containerd[1511]: time="2025-01-13T21:49:48.151796622Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:48.151905 containerd[1511]: time="2025-01-13T21:49:48.151899383Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:48.152102 containerd[1511]: time="2025-01-13T21:49:48.151917846Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:48.160045 kubelet[1917]: I0113 21:49:48.152718 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a" Jan 13 21:49:48.158872 systemd[1]: run-netns-cni\x2d2fedaa8e\x2d9f53\x2dff9b\x2d562e\x2dcfbf1f8b4356.mount: Deactivated successfully. 
Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.152309358Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.152488044Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.152507942Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153313553Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153574461Z" level=info msg="Ensure that sandbox 4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a in task-service has been cleanup successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153753262Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153784318Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153854844Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153955392Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.153973535Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.154315809Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.154442401Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.154483960Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.154552393Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.154645270Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.154665151Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.155126975Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.155241601Z" level=info msg="TearDown network for sandbox 
\"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.155262248Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:48.161703 containerd[1511]: time="2025-01-13T21:49:48.156241094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:5,}" Jan 13 21:49:48.161850 systemd[1]: run-netns-cni\x2df25ae1e3\x2d224b\x2d13ae\x2d70d7\x2d796449100161.mount: Deactivated successfully. Jan 13 21:49:48.167193 containerd[1511]: time="2025-01-13T21:49:48.167128140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:3,}" Jan 13 21:49:48.345833 containerd[1511]: time="2025-01-13T21:49:48.345126232Z" level=error msg="Failed to destroy network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.346092 containerd[1511]: time="2025-01-13T21:49:48.345888788Z" level=error msg="encountered an error cleaning up failed sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.346092 containerd[1511]: time="2025-01-13T21:49:48.346010680Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.347106 kubelet[1917]: E0113 21:49:48.346480 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.347106 kubelet[1917]: E0113 21:49:48.346610 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:48.347106 kubelet[1917]: E0113 21:49:48.346644 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:48.347361 kubelet[1917]: E0113 21:49:48.346727 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:48.376364 containerd[1511]: time="2025-01-13T21:49:48.376274567Z" level=error msg="Failed to destroy network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.377320 containerd[1511]: time="2025-01-13T21:49:48.377043744Z" level=error msg="encountered an error cleaning up failed sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.377320 containerd[1511]: time="2025-01-13T21:49:48.377128945Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.378301 kubelet[1917]: E0113 21:49:48.377702 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:48.378301 kubelet[1917]: E0113 21:49:48.377803 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:48.378301 kubelet[1917]: E0113 21:49:48.377856 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:48.378538 kubelet[1917]: E0113 21:49:48.377921 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:48.904302 kubelet[1917]: E0113 21:49:48.904193 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:49.103692 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf-shm.mount: Deactivated successfully. Jan 13 21:49:49.160582 kubelet[1917]: I0113 21:49:49.160003 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061" Jan 13 21:49:49.161190 containerd[1511]: time="2025-01-13T21:49:49.160967122Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:49:49.163508 containerd[1511]: time="2025-01-13T21:49:49.163295288Z" level=info msg="Ensure that sandbox 5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061 in task-service has been cleanup successfully" Jan 13 21:49:49.167226 containerd[1511]: time="2025-01-13T21:49:49.167106561Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:49:49.167226 containerd[1511]: time="2025-01-13T21:49:49.167136898Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:49:49.167787 systemd[1]: run-netns-cni\x2d33463638\x2d2c3c\x2ddb96\x2d17de\x2da0036e131f0b.mount: Deactivated successfully. 
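Note: the retries above will keep incrementing the Attempt counter for both csi-node-driver-dwq7q and the nginx deployment pod until /var/lib/calico/nodename appears, that is, until calico/node is up on this node. A hedged sketch of waiting for that condition from a debugging session on the host; the five-minute timeout and two-second poll interval are arbitrary illustrative choices, not values taken from the log:

package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"
	deadline := time.Now().Add(5 * time.Minute) // arbitrary timeout for illustration

	for time.Now().Before(deadline) {
		if _, err := os.Stat(nodenameFile); err == nil {
			fmt.Println("calico/node has initialized; pod sandbox retries should now succeed")
			return
		}
		time.Sleep(2 * time.Second) // arbitrary poll interval
	}
	fmt.Fprintln(os.Stderr, "timed out waiting for", nodenameFile, "- calico/node is probably not running on this node")
	os.Exit(1)
}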
Jan 13 21:49:49.169490 containerd[1511]: time="2025-01-13T21:49:49.169456848Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:49:49.169598 containerd[1511]: time="2025-01-13T21:49:49.169572551Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:49:49.169657 containerd[1511]: time="2025-01-13T21:49:49.169598225Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:49:49.171298 containerd[1511]: time="2025-01-13T21:49:49.171271455Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:49.171454 containerd[1511]: time="2025-01-13T21:49:49.171415077Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:49.171454 containerd[1511]: time="2025-01-13T21:49:49.171444448Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:49.172144 containerd[1511]: time="2025-01-13T21:49:49.172101244Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:49.173385 containerd[1511]: time="2025-01-13T21:49:49.172223341Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:49.173385 containerd[1511]: time="2025-01-13T21:49:49.172440066Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:49.173385 containerd[1511]: time="2025-01-13T21:49:49.172917089Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:49.173385 containerd[1511]: time="2025-01-13T21:49:49.173024933Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:49.173385 containerd[1511]: time="2025-01-13T21:49:49.173076047Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:49.174231 containerd[1511]: time="2025-01-13T21:49:49.174198099Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:49.174454 containerd[1511]: time="2025-01-13T21:49:49.174424538Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:49.174454 containerd[1511]: time="2025-01-13T21:49:49.174451314Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:49.174946 kubelet[1917]: I0113 21:49:49.174739 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf" Jan 13 21:49:49.175624 containerd[1511]: time="2025-01-13T21:49:49.175592394Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:49:49.175940 containerd[1511]: time="2025-01-13T21:49:49.175907204Z" level=info msg="Ensure that sandbox 
48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf in task-service has been cleanup successfully" Jan 13 21:49:49.177843 containerd[1511]: time="2025-01-13T21:49:49.176175547Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:49:49.177843 containerd[1511]: time="2025-01-13T21:49:49.176205837Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:49:49.177843 containerd[1511]: time="2025-01-13T21:49:49.176410475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:6,}" Jan 13 21:49:49.179039 systemd[1]: run-netns-cni\x2de88dccfc\x2d68e9\x2def68\x2daf21\x2d64eb7d93d2c3.mount: Deactivated successfully. Jan 13 21:49:49.189165 containerd[1511]: time="2025-01-13T21:49:49.189044125Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:49:49.189242 containerd[1511]: time="2025-01-13T21:49:49.189163997Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:49:49.189242 containerd[1511]: time="2025-01-13T21:49:49.189183914Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:49:49.189701 containerd[1511]: time="2025-01-13T21:49:49.189653929Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:49.189867 containerd[1511]: time="2025-01-13T21:49:49.189838101Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:49.189939 containerd[1511]: time="2025-01-13T21:49:49.189865448Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:49.193681 containerd[1511]: time="2025-01-13T21:49:49.193630012Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:49.193932 containerd[1511]: time="2025-01-13T21:49:49.193753234Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:49.193932 containerd[1511]: time="2025-01-13T21:49:49.193778923Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:49.200645 containerd[1511]: time="2025-01-13T21:49:49.200548065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:4,}" Jan 13 21:49:49.393237 containerd[1511]: time="2025-01-13T21:49:49.393053024Z" level=error msg="Failed to destroy network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.393725 containerd[1511]: time="2025-01-13T21:49:49.393563009Z" level=error msg="encountered an error cleaning up failed sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\", marking sandbox state 
as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.394006 containerd[1511]: time="2025-01-13T21:49:49.393676284Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.395072 kubelet[1917]: E0113 21:49:49.394207 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.395072 kubelet[1917]: E0113 21:49:49.394306 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:49.395072 kubelet[1917]: E0113 21:49:49.394367 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:49.395569 kubelet[1917]: E0113 21:49:49.394446 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:49.396091 containerd[1511]: time="2025-01-13T21:49:49.396008597Z" level=error msg="Failed to destroy network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.397625 containerd[1511]: time="2025-01-13T21:49:49.397503400Z" level=error msg="encountered an error cleaning up failed sandbox 
\"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.397799 containerd[1511]: time="2025-01-13T21:49:49.397595989Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.398429 kubelet[1917]: E0113 21:49:49.398053 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:49.398429 kubelet[1917]: E0113 21:49:49.398159 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:49.398429 kubelet[1917]: E0113 21:49:49.398312 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:49.398995 kubelet[1917]: E0113 21:49:49.398401 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:49.905198 kubelet[1917]: E0113 21:49:49.905135 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:50.102664 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e-shm.mount: Deactivated successfully. 
Jan 13 21:49:50.103166 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59-shm.mount: Deactivated successfully. Jan 13 21:49:50.182591 kubelet[1917]: I0113 21:49:50.182442 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e" Jan 13 21:49:50.184454 containerd[1511]: time="2025-01-13T21:49:50.184033006Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:49:50.185011 containerd[1511]: time="2025-01-13T21:49:50.184484027Z" level=info msg="Ensure that sandbox ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e in task-service has been cleanup successfully" Jan 13 21:49:50.188173 containerd[1511]: time="2025-01-13T21:49:50.186437709Z" level=info msg="TearDown network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" successfully" Jan 13 21:49:50.188173 containerd[1511]: time="2025-01-13T21:49:50.186467504Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" returns successfully" Jan 13 21:49:50.188173 containerd[1511]: time="2025-01-13T21:49:50.187183055Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:49:50.188173 containerd[1511]: time="2025-01-13T21:49:50.187286998Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:49:50.188173 containerd[1511]: time="2025-01-13T21:49:50.187305386Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.188390268Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.188567792Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.188616165Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.189094929Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.189216078Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.189292608Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:50.190493 containerd[1511]: time="2025-01-13T21:49:50.190383585Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:50.188678 systemd[1]: run-netns-cni\x2ddc930395\x2d34e9\x2d1aa2\x2d4201\x2df0d1ea929f71.mount: Deactivated successfully. 
Jan 13 21:49:50.191135 containerd[1511]: time="2025-01-13T21:49:50.190494439Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:50.191135 containerd[1511]: time="2025-01-13T21:49:50.190513129Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:50.193381 containerd[1511]: time="2025-01-13T21:49:50.193029043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:5,}" Jan 13 21:49:50.207312 kubelet[1917]: I0113 21:49:50.205094 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59" Jan 13 21:49:50.207554 containerd[1511]: time="2025-01-13T21:49:50.206294024Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:49:50.207554 containerd[1511]: time="2025-01-13T21:49:50.206643533Z" level=info msg="Ensure that sandbox 8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59 in task-service has been cleanup successfully" Jan 13 21:49:50.208506 containerd[1511]: time="2025-01-13T21:49:50.208474415Z" level=info msg="TearDown network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" successfully" Jan 13 21:49:50.208698 containerd[1511]: time="2025-01-13T21:49:50.208505802Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" returns successfully" Jan 13 21:49:50.210317 systemd[1]: run-netns-cni\x2dcf653d50\x2dfafc\x2d896f\x2deb22\x2d714cea374219.mount: Deactivated successfully. 
Jan 13 21:49:50.214899 containerd[1511]: time="2025-01-13T21:49:50.214824914Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:49:50.215173 containerd[1511]: time="2025-01-13T21:49:50.215144150Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:49:50.215615 containerd[1511]: time="2025-01-13T21:49:50.215579112Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:49:50.217718 containerd[1511]: time="2025-01-13T21:49:50.217552813Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:49:50.217796 containerd[1511]: time="2025-01-13T21:49:50.217708739Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:49:50.217796 containerd[1511]: time="2025-01-13T21:49:50.217737060Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:49:50.218117 containerd[1511]: time="2025-01-13T21:49:50.218088637Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:50.218548 containerd[1511]: time="2025-01-13T21:49:50.218204315Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:50.218548 containerd[1511]: time="2025-01-13T21:49:50.218230653Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:50.220348 containerd[1511]: time="2025-01-13T21:49:50.220029117Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:50.220434 containerd[1511]: time="2025-01-13T21:49:50.220362385Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:50.220434 containerd[1511]: time="2025-01-13T21:49:50.220383911Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:50.221451 containerd[1511]: time="2025-01-13T21:49:50.220945282Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:50.221451 containerd[1511]: time="2025-01-13T21:49:50.221149600Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:50.221451 containerd[1511]: time="2025-01-13T21:49:50.221180288Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:50.221751 containerd[1511]: time="2025-01-13T21:49:50.221603669Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:50.221806 containerd[1511]: time="2025-01-13T21:49:50.221759422Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:50.221806 containerd[1511]: time="2025-01-13T21:49:50.221779568Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" 
returns successfully" Jan 13 21:49:50.226609 containerd[1511]: time="2025-01-13T21:49:50.226131740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:7,}" Jan 13 21:49:50.431942 containerd[1511]: time="2025-01-13T21:49:50.431752794Z" level=error msg="Failed to destroy network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.433083 containerd[1511]: time="2025-01-13T21:49:50.432515049Z" level=error msg="encountered an error cleaning up failed sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.433083 containerd[1511]: time="2025-01-13T21:49:50.432593270Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.433961 kubelet[1917]: E0113 21:49:50.433388 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.433961 kubelet[1917]: E0113 21:49:50.433691 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:50.433961 kubelet[1917]: E0113 21:49:50.433745 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:50.434624 kubelet[1917]: E0113 21:49:50.433921 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:50.463420 containerd[1511]: time="2025-01-13T21:49:50.463279947Z" level=error msg="Failed to destroy network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.464539 containerd[1511]: time="2025-01-13T21:49:50.464494322Z" level=error msg="encountered an error cleaning up failed sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.464945 containerd[1511]: time="2025-01-13T21:49:50.464690250Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.465080 kubelet[1917]: E0113 21:49:50.464970 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:50.465080 kubelet[1917]: E0113 21:49:50.465060 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:50.465363 kubelet[1917]: E0113 21:49:50.465089 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:50.465363 kubelet[1917]: E0113 21:49:50.465156 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": 
rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:50.905583 kubelet[1917]: E0113 21:49:50.905497 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:51.103480 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8-shm.mount: Deactivated successfully. Jan 13 21:49:51.214571 kubelet[1917]: I0113 21:49:51.214447 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0" Jan 13 21:49:51.215468 containerd[1511]: time="2025-01-13T21:49:51.215417520Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" Jan 13 21:49:51.215999 containerd[1511]: time="2025-01-13T21:49:51.215661667Z" level=info msg="Ensure that sandbox 39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0 in task-service has been cleanup successfully" Jan 13 21:49:51.217023 containerd[1511]: time="2025-01-13T21:49:51.216976992Z" level=info msg="TearDown network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" successfully" Jan 13 21:49:51.217023 containerd[1511]: time="2025-01-13T21:49:51.217007925Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" returns successfully" Jan 13 21:49:51.219345 containerd[1511]: time="2025-01-13T21:49:51.218563463Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:49:51.219345 containerd[1511]: time="2025-01-13T21:49:51.218669954Z" level=info msg="TearDown network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" successfully" Jan 13 21:49:51.219345 containerd[1511]: time="2025-01-13T21:49:51.218688595Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" returns successfully" Jan 13 21:49:51.219862 containerd[1511]: time="2025-01-13T21:49:51.219824402Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:49:51.220723 containerd[1511]: time="2025-01-13T21:49:51.219937397Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:49:51.220723 containerd[1511]: time="2025-01-13T21:49:51.219962195Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:49:51.220160 systemd[1]: run-netns-cni\x2d8fcd8898\x2d0fa5\x2df8eb\x2d04e8\x2d5edf069ddbea.mount: Deactivated successfully. 
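The failures repeated above come down to a single missing file: the Calico CNI plugin rejects both ADD and DEL operations with "stat /var/lib/calico/nodename: no such file or directory" until calico-node has written that file, which it only does once its container is running. A minimal Go sketch of such a preflight check, illustrative only and not Calico's actual code (only the path is taken from the log):

    package main

    import (
        "errors"
        "fmt"
        "os"
    )

    // nodenameFile is the path the log entries reference; the sketch assumes the
    // same convention: calico-node writes this file once it has registered the host.
    const nodenameFile = "/var/lib/calico/nodename"

    // readNodename mirrors the failure mode seen above: if the file is absent,
    // report an error that points the operator at the calico/node container.
    func readNodename() (string, error) {
        data, err := os.ReadFile(nodenameFile)
        if errors.Is(err, os.ErrNotExist) {
            return "", fmt.Errorf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile)
        }
        if err != nil {
            return "", err
        }
        return string(data), nil
    }

    func main() {
        name, err := readNodename()
        if err != nil {
            fmt.Println("a CNI add/delete would fail here:", err)
            return
        }
        fmt.Println("node name:", name)
    }

Until the file appears, every RunPodSandbox attempt for csi-node-driver-dwq7q and the nginx deployment keeps failing the same way.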
Jan 13 21:49:51.221170 containerd[1511]: time="2025-01-13T21:49:51.220929799Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:49:51.221170 containerd[1511]: time="2025-01-13T21:49:51.221046709Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:49:51.221170 containerd[1511]: time="2025-01-13T21:49:51.221065412Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:49:51.222493 containerd[1511]: time="2025-01-13T21:49:51.222087885Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:51.222493 containerd[1511]: time="2025-01-13T21:49:51.222194087Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:51.222493 containerd[1511]: time="2025-01-13T21:49:51.222213611Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:51.222973 containerd[1511]: time="2025-01-13T21:49:51.222821723Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:51.222973 containerd[1511]: time="2025-01-13T21:49:51.222927060Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:51.222973 containerd[1511]: time="2025-01-13T21:49:51.222945183Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:51.224346 containerd[1511]: time="2025-01-13T21:49:51.223404931Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:51.224346 containerd[1511]: time="2025-01-13T21:49:51.223543571Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:51.224346 containerd[1511]: time="2025-01-13T21:49:51.223570168Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:51.224346 containerd[1511]: time="2025-01-13T21:49:51.224091919Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:51.224346 containerd[1511]: time="2025-01-13T21:49:51.224188552Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:51.224346 containerd[1511]: time="2025-01-13T21:49:51.224205928Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:51.224852 containerd[1511]: time="2025-01-13T21:49:51.224819928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:8,}" Jan 13 21:49:51.227189 kubelet[1917]: I0113 21:49:51.227162 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8" Jan 13 21:49:51.227829 containerd[1511]: time="2025-01-13T21:49:51.227801445Z" level=info msg="StopPodSandbox for 
\"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" Jan 13 21:49:51.228499 containerd[1511]: time="2025-01-13T21:49:51.228469210Z" level=info msg="Ensure that sandbox 9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8 in task-service has been cleanup successfully" Jan 13 21:49:51.230989 systemd[1]: run-netns-cni\x2d612c5180\x2ddc04\x2dcd82\x2deaf5\x2d72de3d4fd691.mount: Deactivated successfully. Jan 13 21:49:51.231216 containerd[1511]: time="2025-01-13T21:49:51.231187991Z" level=info msg="TearDown network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" successfully" Jan 13 21:49:51.231320 containerd[1511]: time="2025-01-13T21:49:51.231296814Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" returns successfully" Jan 13 21:49:51.233483 containerd[1511]: time="2025-01-13T21:49:51.233360856Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:49:51.233704 containerd[1511]: time="2025-01-13T21:49:51.233677743Z" level=info msg="TearDown network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" successfully" Jan 13 21:49:51.234142 containerd[1511]: time="2025-01-13T21:49:51.234115851Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" returns successfully" Jan 13 21:49:51.236385 containerd[1511]: time="2025-01-13T21:49:51.236356379Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:49:51.236584 containerd[1511]: time="2025-01-13T21:49:51.236558445Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:49:51.236814 containerd[1511]: time="2025-01-13T21:49:51.236673538Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:49:51.237847 containerd[1511]: time="2025-01-13T21:49:51.237815839Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:49:51.238221 containerd[1511]: time="2025-01-13T21:49:51.237940309Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:49:51.238221 containerd[1511]: time="2025-01-13T21:49:51.237966700Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:49:51.238529 containerd[1511]: time="2025-01-13T21:49:51.238403566Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:51.238840 containerd[1511]: time="2025-01-13T21:49:51.238524649Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:51.238840 containerd[1511]: time="2025-01-13T21:49:51.238565380Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:51.239356 containerd[1511]: time="2025-01-13T21:49:51.239145048Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:51.239356 containerd[1511]: time="2025-01-13T21:49:51.239252922Z" level=info msg="TearDown network for sandbox 
\"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:51.239356 containerd[1511]: time="2025-01-13T21:49:51.239273289Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:51.248273 containerd[1511]: time="2025-01-13T21:49:51.248003789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:6,}" Jan 13 21:49:51.400576 containerd[1511]: time="2025-01-13T21:49:51.400510673Z" level=error msg="Failed to destroy network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.401483 containerd[1511]: time="2025-01-13T21:49:51.401173097Z" level=error msg="encountered an error cleaning up failed sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.401483 containerd[1511]: time="2025-01-13T21:49:51.401249069Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:8,} failed, error" error="failed to setup network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.401614 kubelet[1917]: E0113 21:49:51.401512 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.401614 kubelet[1917]: E0113 21:49:51.401579 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:51.401730 kubelet[1917]: E0113 21:49:51.401607 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:51.401730 kubelet[1917]: E0113 21:49:51.401666 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:51.428614 containerd[1511]: time="2025-01-13T21:49:51.427474492Z" level=error msg="Failed to destroy network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.428614 containerd[1511]: time="2025-01-13T21:49:51.428239217Z" level=error msg="encountered an error cleaning up failed sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.428976 containerd[1511]: time="2025-01-13T21:49:51.428309085Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:6,} failed, error" error="failed to setup network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.429290 kubelet[1917]: E0113 21:49:51.429229 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:51.429416 kubelet[1917]: E0113 21:49:51.429308 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:51.429416 kubelet[1917]: E0113 21:49:51.429356 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:51.429512 kubelet[1917]: E0113 21:49:51.429418 1917 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:51.906103 kubelet[1917]: E0113 21:49:51.906010 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:52.102813 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa-shm.mount: Deactivated successfully. Jan 13 21:49:52.103150 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42-shm.mount: Deactivated successfully. Jan 13 21:49:52.231002 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1192118063.mount: Deactivated successfully. Jan 13 21:49:52.238378 kubelet[1917]: I0113 21:49:52.237707 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42" Jan 13 21:49:52.238662 containerd[1511]: time="2025-01-13T21:49:52.238617481Z" level=info msg="StopPodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\"" Jan 13 21:49:52.239275 containerd[1511]: time="2025-01-13T21:49:52.239186769Z" level=info msg="Ensure that sandbox 710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42 in task-service has been cleanup successfully" Jan 13 21:49:52.241026 systemd[1]: run-netns-cni\x2d09a0bed4\x2d37e8\x2dd557\x2d9b3f\x2ddac006f2b7b2.mount: Deactivated successfully. 
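Note the Attempt counter climbing with each RunPodSandbox call (5, 6, 7, 8, ...): kubelet keeps resyncing the pod, each failed sandbox is torn down (its netns and shm mounts are the units systemd reports as deactivated), and the next attempt is scheduled after a delay. A rough Go sketch of that retry shape, assuming a simple exponential backoff (kubelet's real policy differs in detail):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // runPodSandbox stands in for the CRI call that keeps failing above; in the
    // log it fails until calico-node has written /var/lib/calico/nodename.
    func runPodSandbox(attempt int) error {
        return errors.New("plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory")
    }

    func main() {
        backoff := 500 * time.Millisecond
        for attempt := 5; attempt <= 8; attempt++ {
            if err := runPodSandbox(attempt); err == nil {
                fmt.Printf("attempt %d: sandbox created\n", attempt)
                return
            } else {
                // Each failed attempt leaves a sandbox ID behind that is torn
                // down (netns + shm mount removed) before the next try.
                fmt.Printf("attempt %d failed: %v; retrying in %s\n", attempt, err, backoff)
                time.Sleep(backoff)
                backoff *= 2 // exponential backoff; real runtimes cap this
            }
        }
    }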
Jan 13 21:49:52.243149 containerd[1511]: time="2025-01-13T21:49:52.243110440Z" level=info msg="TearDown network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" successfully" Jan 13 21:49:52.243238 containerd[1511]: time="2025-01-13T21:49:52.243147055Z" level=info msg="StopPodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" returns successfully" Jan 13 21:49:52.247640 containerd[1511]: time="2025-01-13T21:49:52.247604776Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" Jan 13 21:49:52.247751 containerd[1511]: time="2025-01-13T21:49:52.247724733Z" level=info msg="TearDown network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" successfully" Jan 13 21:49:52.247813 containerd[1511]: time="2025-01-13T21:49:52.247751412Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" returns successfully" Jan 13 21:49:52.249244 containerd[1511]: time="2025-01-13T21:49:52.248858152Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:49:52.249244 containerd[1511]: time="2025-01-13T21:49:52.248969817Z" level=info msg="TearDown network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" successfully" Jan 13 21:49:52.249244 containerd[1511]: time="2025-01-13T21:49:52.248988728Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" returns successfully" Jan 13 21:49:52.252351 containerd[1511]: time="2025-01-13T21:49:52.251079100Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:49:52.252351 containerd[1511]: time="2025-01-13T21:49:52.251211139Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:49:52.252351 containerd[1511]: time="2025-01-13T21:49:52.251230983Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:49:52.252351 containerd[1511]: time="2025-01-13T21:49:52.251747912Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:49:52.252351 containerd[1511]: time="2025-01-13T21:49:52.251849892Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:49:52.252351 containerd[1511]: time="2025-01-13T21:49:52.251869019Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:49:52.253318 containerd[1511]: time="2025-01-13T21:49:52.253283527Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:52.253508 containerd[1511]: time="2025-01-13T21:49:52.253479361Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:52.253508 containerd[1511]: time="2025-01-13T21:49:52.253504565Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:52.254242 containerd[1511]: time="2025-01-13T21:49:52.254209592Z" level=info msg="StopPodSandbox for 
\"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:52.254589 containerd[1511]: time="2025-01-13T21:49:52.254490349Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:52.254760 containerd[1511]: time="2025-01-13T21:49:52.254735752Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:52.256197 containerd[1511]: time="2025-01-13T21:49:52.256154132Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:52.256414 containerd[1511]: time="2025-01-13T21:49:52.256387374Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:52.256524 containerd[1511]: time="2025-01-13T21:49:52.256501102Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:52.257097 containerd[1511]: time="2025-01-13T21:49:52.257068439Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:52.257312 containerd[1511]: time="2025-01-13T21:49:52.257286303Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:52.257798 containerd[1511]: time="2025-01-13T21:49:52.257425557Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:52.257924 kubelet[1917]: I0113 21:49:52.257635 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa" Jan 13 21:49:52.258816 containerd[1511]: time="2025-01-13T21:49:52.258390354Z" level=info msg="StopPodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\"" Jan 13 21:49:52.258816 containerd[1511]: time="2025-01-13T21:49:52.258602325Z" level=info msg="Ensure that sandbox db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa in task-service has been cleanup successfully" Jan 13 21:49:52.259011 containerd[1511]: time="2025-01-13T21:49:52.258985387Z" level=info msg="TearDown network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" successfully" Jan 13 21:49:52.259170 containerd[1511]: time="2025-01-13T21:49:52.259144573Z" level=info msg="StopPodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" returns successfully" Jan 13 21:49:52.259402 containerd[1511]: time="2025-01-13T21:49:52.259374695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:9,}" Jan 13 21:49:52.261113 systemd[1]: run-netns-cni\x2d5edc80a0\x2df7ca\x2d7da7\x2d57c1\x2d82aae1618da3.mount: Deactivated successfully. 
Jan 13 21:49:52.262293 containerd[1511]: time="2025-01-13T21:49:52.262089471Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" Jan 13 21:49:52.262293 containerd[1511]: time="2025-01-13T21:49:52.262206614Z" level=info msg="TearDown network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" successfully" Jan 13 21:49:52.262293 containerd[1511]: time="2025-01-13T21:49:52.262225379Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" returns successfully" Jan 13 21:49:52.263036 containerd[1511]: time="2025-01-13T21:49:52.262779885Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:49:52.263036 containerd[1511]: time="2025-01-13T21:49:52.262922673Z" level=info msg="TearDown network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" successfully" Jan 13 21:49:52.263036 containerd[1511]: time="2025-01-13T21:49:52.262941937Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" returns successfully" Jan 13 21:49:52.264143 containerd[1511]: time="2025-01-13T21:49:52.263921024Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:49:52.264143 containerd[1511]: time="2025-01-13T21:49:52.264017857Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:49:52.264143 containerd[1511]: time="2025-01-13T21:49:52.264047839Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:49:52.266684 containerd[1511]: time="2025-01-13T21:49:52.266483150Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:49:52.266684 containerd[1511]: time="2025-01-13T21:49:52.266586385Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:49:52.266684 containerd[1511]: time="2025-01-13T21:49:52.266604496Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:49:52.267352 containerd[1511]: time="2025-01-13T21:49:52.267126351Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:52.267352 containerd[1511]: time="2025-01-13T21:49:52.267251534Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:52.267352 containerd[1511]: time="2025-01-13T21:49:52.267270448Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:52.268147 containerd[1511]: time="2025-01-13T21:49:52.267701384Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:52.268147 containerd[1511]: time="2025-01-13T21:49:52.267802394Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:52.268147 containerd[1511]: time="2025-01-13T21:49:52.267820677Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" 
returns successfully" Jan 13 21:49:52.268728 containerd[1511]: time="2025-01-13T21:49:52.268457296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:7,}" Jan 13 21:49:52.314966 containerd[1511]: time="2025-01-13T21:49:52.314917523Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:52.316068 containerd[1511]: time="2025-01-13T21:49:52.316012305Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 13 21:49:52.319382 containerd[1511]: time="2025-01-13T21:49:52.319350042Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:52.346014 containerd[1511]: time="2025-01-13T21:49:52.345869522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:52.347097 containerd[1511]: time="2025-01-13T21:49:52.346986477Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 10.259490399s" Jan 13 21:49:52.347097 containerd[1511]: time="2025-01-13T21:49:52.347031229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 13 21:49:52.382310 containerd[1511]: time="2025-01-13T21:49:52.381770454Z" level=info msg="CreateContainer within sandbox \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 21:49:52.403577 containerd[1511]: time="2025-01-13T21:49:52.403524603Z" level=info msg="CreateContainer within sandbox \"c03bd20d2a9cef97acdf2032e390602aaf571de8b383e40e3e2f3a194594d101\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad\"" Jan 13 21:49:52.405405 containerd[1511]: time="2025-01-13T21:49:52.405287642Z" level=info msg="StartContainer for \"eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad\"" Jan 13 21:49:52.441149 containerd[1511]: time="2025-01-13T21:49:52.441087939Z" level=error msg="Failed to destroy network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.441813 containerd[1511]: time="2025-01-13T21:49:52.441777783Z" level=error msg="encountered an error cleaning up failed sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.441988 
containerd[1511]: time="2025-01-13T21:49:52.441953708Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:9,} failed, error" error="failed to setup network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.442441 kubelet[1917]: E0113 21:49:52.442368 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.442569 kubelet[1917]: E0113 21:49:52.442446 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:52.442569 kubelet[1917]: E0113 21:49:52.442480 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dwq7q" Jan 13 21:49:52.442569 kubelet[1917]: E0113 21:49:52.442533 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dwq7q_calico-system(ad5321ad-96ea-4705-87ad-19d987280dbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dwq7q" podUID="ad5321ad-96ea-4705-87ad-19d987280dbc" Jan 13 21:49:52.448096 containerd[1511]: time="2025-01-13T21:49:52.447315839Z" level=error msg="Failed to destroy network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.448096 containerd[1511]: time="2025-01-13T21:49:52.447817180Z" level=error msg="encountered an error cleaning up failed sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Jan 13 21:49:52.448096 containerd[1511]: time="2025-01-13T21:49:52.447897302Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.448554 kubelet[1917]: E0113 21:49:52.448171 1917 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:49:52.448554 kubelet[1917]: E0113 21:49:52.448252 1917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:52.448554 kubelet[1917]: E0113 21:49:52.448279 1917 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-8587fbcb89-fwj99" Jan 13 21:49:52.448727 kubelet[1917]: E0113 21:49:52.448342 1917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-8587fbcb89-fwj99_default(59ee205f-19de-445d-aaed-96039972d960)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-8587fbcb89-fwj99" podUID="59ee205f-19de-445d-aaed-96039972d960" Jan 13 21:49:52.514526 systemd[1]: Started cri-containerd-eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad.scope - libcontainer container eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad. Jan 13 21:49:52.712861 containerd[1511]: time="2025-01-13T21:49:52.712774571Z" level=info msg="StartContainer for \"eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad\" returns successfully" Jan 13 21:49:52.762564 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 21:49:52.762718 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 13 21:49:52.907615 kubelet[1917]: E0113 21:49:52.907444 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:53.106981 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae-shm.mount: Deactivated successfully. Jan 13 21:49:53.272435 kubelet[1917]: I0113 21:49:53.272382 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae" Jan 13 21:49:53.273707 containerd[1511]: time="2025-01-13T21:49:53.273468385Z" level=info msg="StopPodSandbox for \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\"" Jan 13 21:49:53.274213 containerd[1511]: time="2025-01-13T21:49:53.273804481Z" level=info msg="Ensure that sandbox f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae in task-service has been cleanup successfully" Jan 13 21:49:53.278950 containerd[1511]: time="2025-01-13T21:49:53.278356189Z" level=info msg="TearDown network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\" successfully" Jan 13 21:49:53.278950 containerd[1511]: time="2025-01-13T21:49:53.278384992Z" level=info msg="StopPodSandbox for \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\" returns successfully" Jan 13 21:49:53.278950 containerd[1511]: time="2025-01-13T21:49:53.278813943Z" level=info msg="StopPodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\"" Jan 13 21:49:53.278950 containerd[1511]: time="2025-01-13T21:49:53.278913385Z" level=info msg="TearDown network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" successfully" Jan 13 21:49:53.278950 containerd[1511]: time="2025-01-13T21:49:53.278931106Z" level=info msg="StopPodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" returns successfully" Jan 13 21:49:53.279482 systemd[1]: run-netns-cni\x2d53a4f866\x2d2aed\x2daa41\x2d5e5d\x2dba0dd4308527.mount: Deactivated successfully. 
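The kubelet line repeated every second about /etc/kubernetes/manifests is benign: the static-pod manifest directory simply does not exist on this node, so kubelet skips it. A small sketch of that check, not kubelet's actual code:

    package main

    import (
        "fmt"
        "os"
    )

    // checkStaticPodPath mimics the recurring notice in the log: kubelet watches a
    // static-pod manifest directory and ignores it when the path does not exist.
    func checkStaticPodPath(path string) {
        if _, err := os.Stat(path); os.IsNotExist(err) {
            fmt.Printf("Unable to read config path %q: path does not exist, ignoring\n", path)
            return
        }
        fmt.Printf("static pod manifests would be loaded from %q\n", path)
    }

    func main() {
        checkStaticPodPath("/etc/kubernetes/manifests")
    }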
Jan 13 21:49:53.281888 containerd[1511]: time="2025-01-13T21:49:53.280700720Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" Jan 13 21:49:53.282617 containerd[1511]: time="2025-01-13T21:49:53.280812318Z" level=info msg="TearDown network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" successfully" Jan 13 21:49:53.282617 containerd[1511]: time="2025-01-13T21:49:53.282412957Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" returns successfully" Jan 13 21:49:53.283289 containerd[1511]: time="2025-01-13T21:49:53.283060410Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:49:53.283289 containerd[1511]: time="2025-01-13T21:49:53.283182014Z" level=info msg="TearDown network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" successfully" Jan 13 21:49:53.283289 containerd[1511]: time="2025-01-13T21:49:53.283211904Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" returns successfully" Jan 13 21:49:53.283861 containerd[1511]: time="2025-01-13T21:49:53.283611951Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:49:53.283861 containerd[1511]: time="2025-01-13T21:49:53.283720796Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:49:53.283861 containerd[1511]: time="2025-01-13T21:49:53.283738957Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:49:53.284701 containerd[1511]: time="2025-01-13T21:49:53.284575240Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:49:53.285000 kubelet[1917]: I0113 21:49:53.284967 1917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8" Jan 13 21:49:53.285300 containerd[1511]: time="2025-01-13T21:49:53.285228055Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:49:53.285610 containerd[1511]: time="2025-01-13T21:49:53.285535187Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:49:53.285809 containerd[1511]: time="2025-01-13T21:49:53.285751336Z" level=info msg="StopPodSandbox for \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\"" Jan 13 21:49:53.286007 containerd[1511]: time="2025-01-13T21:49:53.285979980Z" level=info msg="Ensure that sandbox b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8 in task-service has been cleanup successfully" Jan 13 21:49:53.286703 containerd[1511]: time="2025-01-13T21:49:53.286646141Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:49:53.287046 containerd[1511]: time="2025-01-13T21:49:53.286928617Z" level=info msg="TearDown network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\" successfully" Jan 13 21:49:53.287046 containerd[1511]: time="2025-01-13T21:49:53.286984061Z" level=info msg="StopPodSandbox for 
\"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\" returns successfully" Jan 13 21:49:53.287564 containerd[1511]: time="2025-01-13T21:49:53.287381662Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:49:53.287564 containerd[1511]: time="2025-01-13T21:49:53.287410848Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:49:53.288397 containerd[1511]: time="2025-01-13T21:49:53.288054569Z" level=info msg="StopPodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\"" Jan 13 21:49:53.288397 containerd[1511]: time="2025-01-13T21:49:53.288164010Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:49:53.291987 kubelet[1917]: I0113 21:49:53.291846 1917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-tps9v" podStartSLOduration=4.333267854 podStartE2EDuration="26.291818773s" podCreationTimestamp="2025-01-13 21:49:27 +0000 UTC" firstStartedPulling="2025-01-13 21:49:30.391070761 +0000 UTC m=+4.087315738" lastFinishedPulling="2025-01-13 21:49:52.349621673 +0000 UTC m=+26.045866657" observedRunningTime="2025-01-13 21:49:53.288015645 +0000 UTC m=+26.984260634" watchObservedRunningTime="2025-01-13 21:49:53.291818773 +0000 UTC m=+26.988063756" Jan 13 21:49:53.292226 systemd[1]: run-netns-cni\x2dfb33c693\x2d3b79\x2d5abe\x2d0ddd\x2dd4dc53b2b0c6.mount: Deactivated successfully. Jan 13 21:49:53.300035 containerd[1511]: time="2025-01-13T21:49:53.299627434Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:49:53.300035 containerd[1511]: time="2025-01-13T21:49:53.299666305Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:49:53.300574 containerd[1511]: time="2025-01-13T21:49:53.288258140Z" level=info msg="TearDown network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" successfully" Jan 13 21:49:53.300574 containerd[1511]: time="2025-01-13T21:49:53.300541613Z" level=info msg="StopPodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" returns successfully" Jan 13 21:49:53.301940 containerd[1511]: time="2025-01-13T21:49:53.301448642Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" Jan 13 21:49:53.301940 containerd[1511]: time="2025-01-13T21:49:53.301824782Z" level=info msg="TearDown network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" successfully" Jan 13 21:49:53.302427 containerd[1511]: time="2025-01-13T21:49:53.302267203Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" returns successfully" Jan 13 21:49:53.303420 containerd[1511]: time="2025-01-13T21:49:53.303192119Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:49:53.304518 containerd[1511]: time="2025-01-13T21:49:53.304394138Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:49:53.304518 containerd[1511]: time="2025-01-13T21:49:53.304455977Z" level=info msg="StopPodSandbox for 
\"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:49:53.305025 containerd[1511]: time="2025-01-13T21:49:53.304870829Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:49:53.306174 containerd[1511]: time="2025-01-13T21:49:53.306124768Z" level=info msg="TearDown network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" successfully" Jan 13 21:49:53.306452 containerd[1511]: time="2025-01-13T21:49:53.306287388Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" returns successfully" Jan 13 21:49:53.312992 containerd[1511]: time="2025-01-13T21:49:53.312956969Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:49:53.313482 containerd[1511]: time="2025-01-13T21:49:53.313208733Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:49:53.313482 containerd[1511]: time="2025-01-13T21:49:53.313299312Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:49:53.313482 containerd[1511]: time="2025-01-13T21:49:53.313422776Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:49:53.314268 containerd[1511]: time="2025-01-13T21:49:53.313755327Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:49:53.314268 containerd[1511]: time="2025-01-13T21:49:53.313789309Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:49:53.314941 containerd[1511]: time="2025-01-13T21:49:53.314913101Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:49:53.315497 containerd[1511]: time="2025-01-13T21:49:53.315470614Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:49:53.316260 containerd[1511]: time="2025-01-13T21:49:53.315628001Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:49:53.316260 containerd[1511]: time="2025-01-13T21:49:53.315111619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:10,}" Jan 13 21:49:53.317108 containerd[1511]: time="2025-01-13T21:49:53.316787126Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:49:53.317374 containerd[1511]: time="2025-01-13T21:49:53.317288475Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:49:53.317506 containerd[1511]: time="2025-01-13T21:49:53.317479329Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:49:53.318020 containerd[1511]: time="2025-01-13T21:49:53.317991189Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:49:53.318203 containerd[1511]: 
time="2025-01-13T21:49:53.318178597Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:49:53.318297 containerd[1511]: time="2025-01-13T21:49:53.318274444Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:49:53.319639 containerd[1511]: time="2025-01-13T21:49:53.319611865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:8,}" Jan 13 21:49:53.563808 systemd-networkd[1419]: cali747f6eee4e5: Link UP Jan 13 21:49:53.564182 systemd-networkd[1419]: cali747f6eee4e5: Gained carrier Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.408 [INFO][3092] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.442 [INFO][3092] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0 nginx-deployment-8587fbcb89- default 59ee205f-19de-445d-aaed-96039972d960 1211 0 2025-01-13 21:49:45 +0000 UTC map[app:nginx pod-template-hash:8587fbcb89 projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 10.230.9.94 nginx-deployment-8587fbcb89-fwj99 eth0 default [] [] [kns.default ksa.default.default] cali747f6eee4e5 [] []}} ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.442 [INFO][3092] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.494 [INFO][3108] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" HandleID="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Workload="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.510 [INFO][3108] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" HandleID="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Workload="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001fc1b0), Attrs:map[string]string{"namespace":"default", "node":"10.230.9.94", "pod":"nginx-deployment-8587fbcb89-fwj99", "timestamp":"2025-01-13 21:49:53.493972197 +0000 UTC"}, Hostname:"10.230.9.94", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.510 [INFO][3108] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.510 [INFO][3108] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.510 [INFO][3108] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.230.9.94' Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.513 [INFO][3108] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.518 [INFO][3108] ipam/ipam.go 372: Looking up existing affinities for host host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.525 [INFO][3108] ipam/ipam.go 489: Trying affinity for 192.168.49.64/26 host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.528 [INFO][3108] ipam/ipam.go 155: Attempting to load block cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.531 [INFO][3108] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.531 [INFO][3108] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.49.64/26 handle="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.534 [INFO][3108] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32 Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.542 [INFO][3108] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.49.64/26 handle="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.550 [INFO][3108] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.49.65/26] block=192.168.49.64/26 handle="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.550 [INFO][3108] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.49.65/26] handle="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" host="10.230.9.94" Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.550 [INFO][3108] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 21:49:53.576524 containerd[1511]: 2025-01-13 21:49:53.550 [INFO][3108] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.49.65/26] IPv6=[] ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" HandleID="k8s-pod-network.536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Workload="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.577947 containerd[1511]: 2025-01-13 21:49:53.553 [INFO][3092] cni-plugin/k8s.go 386: Populated endpoint ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0", GenerateName:"nginx-deployment-8587fbcb89-", Namespace:"default", SelfLink:"", UID:"59ee205f-19de-445d-aaed-96039972d960", ResourceVersion:"1211", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 49, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"8587fbcb89", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"", Pod:"nginx-deployment-8587fbcb89-fwj99", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.49.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali747f6eee4e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:49:53.577947 containerd[1511]: 2025-01-13 21:49:53.553 [INFO][3092] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.49.65/32] ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.577947 containerd[1511]: 2025-01-13 21:49:53.554 [INFO][3092] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali747f6eee4e5 ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.577947 containerd[1511]: 2025-01-13 21:49:53.565 [INFO][3092] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.577947 containerd[1511]: 2025-01-13 21:49:53.566 [INFO][3092] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0", GenerateName:"nginx-deployment-8587fbcb89-", Namespace:"default", SelfLink:"", UID:"59ee205f-19de-445d-aaed-96039972d960", ResourceVersion:"1211", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 49, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"8587fbcb89", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32", Pod:"nginx-deployment-8587fbcb89-fwj99", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.49.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali747f6eee4e5", MAC:"a2:a0:c2:26:41:b1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:49:53.577947 containerd[1511]: 2025-01-13 21:49:53.573 [INFO][3092] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32" Namespace="default" Pod="nginx-deployment-8587fbcb89-fwj99" WorkloadEndpoint="10.230.9.94-k8s-nginx--deployment--8587fbcb89--fwj99-eth0" Jan 13 21:49:53.611735 containerd[1511]: time="2025-01-13T21:49:53.611605303Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:49:53.611963 containerd[1511]: time="2025-01-13T21:49:53.611747003Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:49:53.611963 containerd[1511]: time="2025-01-13T21:49:53.611802460Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:53.612198 containerd[1511]: time="2025-01-13T21:49:53.611958888Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:53.639571 systemd[1]: Started cri-containerd-536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32.scope - libcontainer container 536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32. 
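The ipam/ipam.go trace above (acquire the host-wide IPAM lock, look up the host's affinity for 192.168.49.64/26, load the block, claim the lowest free address, write the block back, release the lock) can be pictured with a much-simplified bitmap allocator. The sketch below is only an illustration of that logged sequence under invented type and function names; it is not Calico's implementation.

```go
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// block is a toy stand-in for a Calico IPAM block: a /26 with a used-bitmap.
type block struct {
	cidr netip.Prefix // e.g. 192.168.49.64/26
	used [64]bool     // one slot per address in the /26
}

var hostWideLock sync.Mutex // mirrors the "host-wide IPAM lock" in the log

// assignNext claims the lowest free address in the block, skipping the
// network address at index 0, and returns it as a /32.
func assignNext(b *block) (netip.Prefix, error) {
	hostWideLock.Lock()
	defer hostWideLock.Unlock()

	base := b.cidr.Addr()
	for i := 1; i < 64; i++ {
		if b.used[i] {
			continue
		}
		addr := base
		for j := 0; j < i; j++ {
			addr = addr.Next()
		}
		b.used[i] = true // "Writing block in order to claim IPs"
		return netip.PrefixFrom(addr, 32), nil
	}
	return netip.Prefix{}, fmt.Errorf("block %s is full", b.cidr)
}

func main() {
	b := &block{cidr: netip.MustParsePrefix("192.168.49.64/26")}
	for i := 0; i < 3; i++ {
		p, _ := assignNext(b)
		fmt.Println(p) // 192.168.49.65/32, .66/32, .67/32 — the addresses the log hands out
	}
}
```

Run in order, the three claims reproduce the addresses assigned in this log: .65 for the nginx pod, .66 for csi-node-driver-dwq7q, and .67 later for nfs-server-provisioner-0.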
Jan 13 21:49:53.662749 systemd-networkd[1419]: cali88d89b2188a: Link UP Jan 13 21:49:53.665471 systemd-networkd[1419]: cali88d89b2188a: Gained carrier Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.406 [INFO][3083] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.442 [INFO][3083] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.230.9.94-k8s-csi--node--driver--dwq7q-eth0 csi-node-driver- calico-system ad5321ad-96ea-4705-87ad-19d987280dbc 1132 0 2025-01-13 21:49:27 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:56747c9949 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 10.230.9.94 csi-node-driver-dwq7q eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali88d89b2188a [] []}} ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.442 [INFO][3083] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.497 [INFO][3112] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" HandleID="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Workload="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.512 [INFO][3112] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" HandleID="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Workload="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319350), Attrs:map[string]string{"namespace":"calico-system", "node":"10.230.9.94", "pod":"csi-node-driver-dwq7q", "timestamp":"2025-01-13 21:49:53.497154966 +0000 UTC"}, Hostname:"10.230.9.94", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.512 [INFO][3112] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.550 [INFO][3112] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.550 [INFO][3112] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.230.9.94' Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.614 [INFO][3112] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.624 [INFO][3112] ipam/ipam.go 372: Looking up existing affinities for host host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.631 [INFO][3112] ipam/ipam.go 489: Trying affinity for 192.168.49.64/26 host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.635 [INFO][3112] ipam/ipam.go 155: Attempting to load block cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.638 [INFO][3112] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.638 [INFO][3112] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.49.64/26 handle="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.641 [INFO][3112] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9 Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.646 [INFO][3112] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.49.64/26 handle="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.653 [INFO][3112] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.49.66/26] block=192.168.49.64/26 handle="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.653 [INFO][3112] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.49.66/26] handle="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" host="10.230.9.94" Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.653 [INFO][3112] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 21:49:53.689584 containerd[1511]: 2025-01-13 21:49:53.653 [INFO][3112] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.49.66/26] IPv6=[] ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" HandleID="k8s-pod-network.2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Workload="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.692114 containerd[1511]: 2025-01-13 21:49:53.656 [INFO][3083] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-csi--node--driver--dwq7q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ad5321ad-96ea-4705-87ad-19d987280dbc", ResourceVersion:"1132", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 49, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"", Pod:"csi-node-driver-dwq7q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.49.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali88d89b2188a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:49:53.692114 containerd[1511]: 2025-01-13 21:49:53.656 [INFO][3083] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.49.66/32] ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.692114 containerd[1511]: 2025-01-13 21:49:53.656 [INFO][3083] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali88d89b2188a ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.692114 containerd[1511]: 2025-01-13 21:49:53.666 [INFO][3083] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.692114 containerd[1511]: 2025-01-13 21:49:53.668 [INFO][3083] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" 
WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-csi--node--driver--dwq7q-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ad5321ad-96ea-4705-87ad-19d987280dbc", ResourceVersion:"1132", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 49, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9", Pod:"csi-node-driver-dwq7q", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.49.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali88d89b2188a", MAC:"a6:2c:be:7c:4f:b3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:49:53.692114 containerd[1511]: 2025-01-13 21:49:53.687 [INFO][3083] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9" Namespace="calico-system" Pod="csi-node-driver-dwq7q" WorkloadEndpoint="10.230.9.94-k8s-csi--node--driver--dwq7q-eth0" Jan 13 21:49:53.711765 containerd[1511]: time="2025-01-13T21:49:53.711716916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-8587fbcb89-fwj99,Uid:59ee205f-19de-445d-aaed-96039972d960,Namespace:default,Attempt:8,} returns sandbox id \"536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32\"" Jan 13 21:49:53.714434 containerd[1511]: time="2025-01-13T21:49:53.713970923Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Jan 13 21:49:53.739850 containerd[1511]: time="2025-01-13T21:49:53.739705967Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:49:53.739850 containerd[1511]: time="2025-01-13T21:49:53.739774693Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:49:53.739850 containerd[1511]: time="2025-01-13T21:49:53.739792417Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:53.740427 containerd[1511]: time="2025-01-13T21:49:53.740318747Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:49:53.770580 systemd[1]: Started cri-containerd-2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9.scope - libcontainer container 2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9. 
Jan 13 21:49:53.801293 containerd[1511]: time="2025-01-13T21:49:53.801164132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dwq7q,Uid:ad5321ad-96ea-4705-87ad-19d987280dbc,Namespace:calico-system,Attempt:10,} returns sandbox id \"2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9\"" Jan 13 21:49:53.908612 kubelet[1917]: E0113 21:49:53.908453 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:54.498730 kernel: bpftool[3362]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 13 21:49:54.817722 systemd-networkd[1419]: vxlan.calico: Link UP Jan 13 21:49:54.817734 systemd-networkd[1419]: vxlan.calico: Gained carrier Jan 13 21:49:54.909573 kubelet[1917]: E0113 21:49:54.908740 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:55.248556 systemd-networkd[1419]: cali747f6eee4e5: Gained IPv6LL Jan 13 21:49:55.568703 systemd-networkd[1419]: cali88d89b2188a: Gained IPv6LL Jan 13 21:49:55.910004 kubelet[1917]: E0113 21:49:55.909863 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:56.017554 systemd-networkd[1419]: vxlan.calico: Gained IPv6LL Jan 13 21:49:56.911879 kubelet[1917]: E0113 21:49:56.911518 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:57.533820 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2919238312.mount: Deactivated successfully. Jan 13 21:49:57.912558 kubelet[1917]: E0113 21:49:57.912360 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:58.913299 kubelet[1917]: E0113 21:49:58.913217 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:49:59.342073 containerd[1511]: time="2025-01-13T21:49:59.340619818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:59.342073 containerd[1511]: time="2025-01-13T21:49:59.341997146Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=71036018" Jan 13 21:49:59.342885 containerd[1511]: time="2025-01-13T21:49:59.342851177Z" level=info msg="ImageCreate event name:\"sha256:29ef6eaebfc53650f3a4609edbf9d35e866f56b2c5e01d32d93439031b300f0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:59.346210 containerd[1511]: time="2025-01-13T21:49:59.346170114Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:eca1d1ff18c7af45f86b7e0b572090f563a676ddca3da2ecff678390366335ad\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:49:59.347886 containerd[1511]: time="2025-01-13T21:49:59.347844709Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:29ef6eaebfc53650f3a4609edbf9d35e866f56b2c5e01d32d93439031b300f0b\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:eca1d1ff18c7af45f86b7e0b572090f563a676ddca3da2ecff678390366335ad\", size \"71035896\" in 5.633818377s" Jan 13 21:49:59.347967 containerd[1511]: time="2025-01-13T21:49:59.347888445Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference 
\"sha256:29ef6eaebfc53650f3a4609edbf9d35e866f56b2c5e01d32d93439031b300f0b\"" Jan 13 21:49:59.350692 containerd[1511]: time="2025-01-13T21:49:59.350004152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 13 21:49:59.352087 containerd[1511]: time="2025-01-13T21:49:59.352048144Z" level=info msg="CreateContainer within sandbox \"536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Jan 13 21:49:59.370923 containerd[1511]: time="2025-01-13T21:49:59.370868544Z" level=info msg="CreateContainer within sandbox \"536d733b37bfa6ba82842732859861f715065fb10273c83045c95b2d5edcfa32\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id \"35922f72f5f45a02f7bd070d0a85fb85b80c509d74a49676ab10942775ea5a50\"" Jan 13 21:49:59.372026 containerd[1511]: time="2025-01-13T21:49:59.371732357Z" level=info msg="StartContainer for \"35922f72f5f45a02f7bd070d0a85fb85b80c509d74a49676ab10942775ea5a50\"" Jan 13 21:49:59.419573 systemd[1]: Started cri-containerd-35922f72f5f45a02f7bd070d0a85fb85b80c509d74a49676ab10942775ea5a50.scope - libcontainer container 35922f72f5f45a02f7bd070d0a85fb85b80c509d74a49676ab10942775ea5a50. Jan 13 21:49:59.461397 containerd[1511]: time="2025-01-13T21:49:59.461302461Z" level=info msg="StartContainer for \"35922f72f5f45a02f7bd070d0a85fb85b80c509d74a49676ab10942775ea5a50\" returns successfully" Jan 13 21:49:59.913466 kubelet[1917]: E0113 21:49:59.913384 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:00.913588 kubelet[1917]: E0113 21:50:00.913500 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:01.125765 containerd[1511]: time="2025-01-13T21:50:01.125696170Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:01.126981 containerd[1511]: time="2025-01-13T21:50:01.126828785Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 13 21:50:01.127871 containerd[1511]: time="2025-01-13T21:50:01.127795279Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:01.130620 containerd[1511]: time="2025-01-13T21:50:01.130552381Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:01.131893 containerd[1511]: time="2025-01-13T21:50:01.131679931Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.781634012s" Jan 13 21:50:01.131893 containerd[1511]: time="2025-01-13T21:50:01.131723651Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 13 21:50:01.134763 containerd[1511]: time="2025-01-13T21:50:01.134515762Z" level=info msg="CreateContainer within sandbox 
\"2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 13 21:50:01.154883 containerd[1511]: time="2025-01-13T21:50:01.154844162Z" level=info msg="CreateContainer within sandbox \"2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9932ef852dfc106921fdc96425908d01ee3ba2b70d5989bf7e01d88373eba828\"" Jan 13 21:50:01.155922 containerd[1511]: time="2025-01-13T21:50:01.155770225Z" level=info msg="StartContainer for \"9932ef852dfc106921fdc96425908d01ee3ba2b70d5989bf7e01d88373eba828\"" Jan 13 21:50:01.198552 systemd[1]: Started cri-containerd-9932ef852dfc106921fdc96425908d01ee3ba2b70d5989bf7e01d88373eba828.scope - libcontainer container 9932ef852dfc106921fdc96425908d01ee3ba2b70d5989bf7e01d88373eba828. Jan 13 21:50:01.246606 containerd[1511]: time="2025-01-13T21:50:01.246450773Z" level=info msg="StartContainer for \"9932ef852dfc106921fdc96425908d01ee3ba2b70d5989bf7e01d88373eba828\" returns successfully" Jan 13 21:50:01.249351 containerd[1511]: time="2025-01-13T21:50:01.249124138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 13 21:50:01.914081 kubelet[1917]: E0113 21:50:01.914004 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:02.120471 systemd[1]: run-containerd-runc-k8s.io-eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad-runc.ffVLuq.mount: Deactivated successfully. Jan 13 21:50:02.219845 kubelet[1917]: I0113 21:50:02.219650 1917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx-deployment-8587fbcb89-fwj99" podStartSLOduration=11.583403419 podStartE2EDuration="17.219623303s" podCreationTimestamp="2025-01-13 21:49:45 +0000 UTC" firstStartedPulling="2025-01-13 21:49:53.713675669 +0000 UTC m=+27.409920647" lastFinishedPulling="2025-01-13 21:49:59.349895553 +0000 UTC m=+33.046140531" observedRunningTime="2025-01-13 21:50:00.353278792 +0000 UTC m=+34.049523797" watchObservedRunningTime="2025-01-13 21:50:02.219623303 +0000 UTC m=+35.915868296" Jan 13 21:50:02.914932 kubelet[1917]: E0113 21:50:02.914852 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:03.375234 containerd[1511]: time="2025-01-13T21:50:03.375168524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:03.376574 containerd[1511]: time="2025-01-13T21:50:03.376342918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 13 21:50:03.377443 containerd[1511]: time="2025-01-13T21:50:03.377406632Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:03.380347 containerd[1511]: time="2025-01-13T21:50:03.380257347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:03.381658 containerd[1511]: time="2025-01-13T21:50:03.381483970Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" 
with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.132313575s" Jan 13 21:50:03.381658 containerd[1511]: time="2025-01-13T21:50:03.381539446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 13 21:50:03.384485 containerd[1511]: time="2025-01-13T21:50:03.384142531Z" level=info msg="CreateContainer within sandbox \"2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 13 21:50:03.410045 containerd[1511]: time="2025-01-13T21:50:03.409996477Z" level=info msg="CreateContainer within sandbox \"2f9fed6db49f5c9c376e3d27b1ddb64084fb13788e6b5c6a48c9d26ce25f06d9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4672f3b01036d2303ba1fc992d75f97ee2e809a34e17e03bb2e2d6b793941f64\"" Jan 13 21:50:03.411058 containerd[1511]: time="2025-01-13T21:50:03.411023666Z" level=info msg="StartContainer for \"4672f3b01036d2303ba1fc992d75f97ee2e809a34e17e03bb2e2d6b793941f64\"" Jan 13 21:50:03.450844 systemd[1]: run-containerd-runc-k8s.io-4672f3b01036d2303ba1fc992d75f97ee2e809a34e17e03bb2e2d6b793941f64-runc.Oj2yfF.mount: Deactivated successfully. Jan 13 21:50:03.461566 systemd[1]: Started cri-containerd-4672f3b01036d2303ba1fc992d75f97ee2e809a34e17e03bb2e2d6b793941f64.scope - libcontainer container 4672f3b01036d2303ba1fc992d75f97ee2e809a34e17e03bb2e2d6b793941f64. 
Jan 13 21:50:03.504153 containerd[1511]: time="2025-01-13T21:50:03.503865130Z" level=info msg="StartContainer for \"4672f3b01036d2303ba1fc992d75f97ee2e809a34e17e03bb2e2d6b793941f64\" returns successfully" Jan 13 21:50:03.915369 kubelet[1917]: E0113 21:50:03.915258 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:04.040151 kubelet[1917]: I0113 21:50:04.039962 1917 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 13 21:50:04.040151 kubelet[1917]: I0113 21:50:04.040025 1917 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 13 21:50:04.376858 kubelet[1917]: I0113 21:50:04.376428 1917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dwq7q" podStartSLOduration=27.796842075 podStartE2EDuration="37.376410017s" podCreationTimestamp="2025-01-13 21:49:27 +0000 UTC" firstStartedPulling="2025-01-13 21:49:53.8028396 +0000 UTC m=+27.499084577" lastFinishedPulling="2025-01-13 21:50:03.382407547 +0000 UTC m=+37.078652519" observedRunningTime="2025-01-13 21:50:04.376337833 +0000 UTC m=+38.072582817" watchObservedRunningTime="2025-01-13 21:50:04.376410017 +0000 UTC m=+38.072654994" Jan 13 21:50:04.916477 kubelet[1917]: E0113 21:50:04.916373 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:05.916814 kubelet[1917]: E0113 21:50:05.916743 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:06.886604 kubelet[1917]: E0113 21:50:06.886530 1917 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:06.917826 kubelet[1917]: E0113 21:50:06.917710 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:07.902021 systemd[1]: Created slice kubepods-besteffort-pod86042792_0f0c_4b9f_9b25_4b0f0cb1a312.slice - libcontainer container kubepods-besteffort-pod86042792_0f0c_4b9f_9b25_4b0f0cb1a312.slice. 
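The kubelet's pod_startup_latency_tracker entry above reports podStartE2EDuration as the gap between pod creation and the observed running time, and podStartSLOduration as that gap minus the image-pulling window (firstStartedPulling to lastFinishedPulling). A small Go sketch reproducing the arithmetic from the csi-node-driver-dwq7q timestamps in that entry; the mustParse helper is ours, only the timestamp math mirrors the log.

```go
package main

import (
	"fmt"
	"time"
)

// mustParse reads the "2006-01-02 15:04:05.999999999 -0700 MST" timestamps the
// kubelet logs (with the trailing "m=+..." monotonic suffix stripped).
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the csi-node-driver-dwq7q latency entry above.
	created := mustParse("2025-01-13 21:49:27 +0000 UTC")
	firstPull := mustParse("2025-01-13 21:49:53.8028396 +0000 UTC")
	lastPull := mustParse("2025-01-13 21:50:03.382407547 +0000 UTC")
	observedRunning := mustParse("2025-01-13 21:50:04.376410017 +0000 UTC")

	pulling := lastPull.Sub(firstPull)             // ~9.5796s spent pulling images
	e2e := observedRunning.Sub(created)            // 37.376410017s = podStartE2EDuration
	fmt.Println("E2E:", e2e)
	fmt.Println("pulling:", pulling)
	fmt.Println("SLO:", e2e-pulling)               // 27.796842... = podStartSLOduration
}
```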
Jan 13 21:50:07.918369 kubelet[1917]: E0113 21:50:07.918077 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:08.010030 kubelet[1917]: I0113 21:50:08.009895 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/86042792-0f0c-4b9f-9b25-4b0f0cb1a312-data\") pod \"nfs-server-provisioner-0\" (UID: \"86042792-0f0c-4b9f-9b25-4b0f0cb1a312\") " pod="default/nfs-server-provisioner-0" Jan 13 21:50:08.010295 kubelet[1917]: I0113 21:50:08.010088 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb6c2\" (UniqueName: \"kubernetes.io/projected/86042792-0f0c-4b9f-9b25-4b0f0cb1a312-kube-api-access-mb6c2\") pod \"nfs-server-provisioner-0\" (UID: \"86042792-0f0c-4b9f-9b25-4b0f0cb1a312\") " pod="default/nfs-server-provisioner-0" Jan 13 21:50:08.212400 containerd[1511]: time="2025-01-13T21:50:08.211590335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:86042792-0f0c-4b9f-9b25-4b0f0cb1a312,Namespace:default,Attempt:0,}" Jan 13 21:50:08.422443 systemd-networkd[1419]: cali60e51b789ff: Link UP Jan 13 21:50:08.424100 systemd-networkd[1419]: cali60e51b789ff: Gained carrier Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.292 [INFO][3682] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.230.9.94-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default 86042792-0f0c-4b9f-9b25-4b0f0cb1a312 1347 0 2025-01-13 21:50:07 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 10.230.9.94 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.292 [INFO][3682] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.343 [INFO][3694] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" HandleID="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Workload="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.358 [INFO][3694] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" HandleID="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Workload="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318a20), Attrs:map[string]string{"namespace":"default", "node":"10.230.9.94", "pod":"nfs-server-provisioner-0", "timestamp":"2025-01-13 21:50:08.343501025 +0000 UTC"}, Hostname:"10.230.9.94", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.359 [INFO][3694] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.359 [INFO][3694] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.359 [INFO][3694] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.230.9.94' Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.364 [INFO][3694] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.372 [INFO][3694] ipam/ipam.go 372: Looking up existing affinities for host host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.379 [INFO][3694] ipam/ipam.go 489: Trying affinity for 192.168.49.64/26 host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.381 [INFO][3694] ipam/ipam.go 155: Attempting to load block cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.385 [INFO][3694] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.385 [INFO][3694] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.49.64/26 handle="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.388 [INFO][3694] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24 Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.395 [INFO][3694] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.49.64/26 handle="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.414 [INFO][3694] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.49.67/26] block=192.168.49.64/26 handle="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.414 [INFO][3694] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.49.67/26] handle="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" host="10.230.9.94" Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.414 [INFO][3694] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 21:50:08.443912 containerd[1511]: 2025-01-13 21:50:08.414 [INFO][3694] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.49.67/26] IPv6=[] ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" HandleID="k8s-pod-network.ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Workload="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.445115 containerd[1511]: 2025-01-13 21:50:08.416 [INFO][3682] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"86042792-0f0c-4b9f-9b25-4b0f0cb1a312", ResourceVersion:"1347", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 50, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.49.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:50:08.445115 containerd[1511]: 2025-01-13 21:50:08.417 [INFO][3682] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.49.67/32] ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.445115 containerd[1511]: 2025-01-13 21:50:08.417 [INFO][3682] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.445115 containerd[1511]: 2025-01-13 21:50:08.424 [INFO][3682] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.445437 containerd[1511]: 2025-01-13 21:50:08.426 [INFO][3682] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"86042792-0f0c-4b9f-9b25-4b0f0cb1a312", ResourceVersion:"1347", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 50, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.49.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"72:2b:07:d6:ca:2d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:50:08.445437 containerd[1511]: 2025-01-13 21:50:08.441 [INFO][3682] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.230.9.94-k8s-nfs--server--provisioner--0-eth0" Jan 13 21:50:08.500064 containerd[1511]: time="2025-01-13T21:50:08.499574908Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:50:08.500064 containerd[1511]: time="2025-01-13T21:50:08.499707812Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:50:08.500064 containerd[1511]: time="2025-01-13T21:50:08.499734204Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:50:08.500064 containerd[1511]: time="2025-01-13T21:50:08.499895475Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:50:08.538632 systemd[1]: Started cri-containerd-ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24.scope - libcontainer container ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24. 
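In the WorkloadEndpoint dump above, the NFS provisioner's service ports are printed in hex (Port:0x801, 0x8023, 0x4e50, 0x36b, 0x6f, 0x296). Decoding them confirms they are the familiar port numbers for the names listed in the same entry: nfs 2049, nlockmgr 32803, mountd 20048, rquotad 875, rpcbind 111, statd 662. A tiny Go check of that conversion:

```go
package main

import "fmt"

func main() {
	// Hex port values as printed in the WorkloadEndpointPort dump, paired with
	// the service names from the same log entry.
	ports := []struct {
		name string
		hex  uint16
	}{
		{"nfs", 0x801},       // 2049
		{"nlockmgr", 0x8023}, // 32803
		{"mountd", 0x4e50},   // 20048
		{"rquotad", 0x36b},   // 875
		{"rpcbind", 0x6f},    // 111
		{"statd", 0x296},     // 662
	}
	for _, p := range ports {
		fmt.Printf("%-9s 0x%04x = %d\n", p.name, p.hex, p.hex)
	}
}
```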
Jan 13 21:50:08.599853 containerd[1511]: time="2025-01-13T21:50:08.599779575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:86042792-0f0c-4b9f-9b25-4b0f0cb1a312,Namespace:default,Attempt:0,} returns sandbox id \"ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24\"" Jan 13 21:50:08.603637 containerd[1511]: time="2025-01-13T21:50:08.602603186Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Jan 13 21:50:08.919254 kubelet[1917]: E0113 21:50:08.919060 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:09.127473 systemd[1]: run-containerd-runc-k8s.io-ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24-runc.TlN59E.mount: Deactivated successfully. Jan 13 21:50:09.919356 kubelet[1917]: E0113 21:50:09.919254 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:10.416973 systemd-networkd[1419]: cali60e51b789ff: Gained IPv6LL Jan 13 21:50:10.919655 kubelet[1917]: E0113 21:50:10.919580 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:11.921429 kubelet[1917]: E0113 21:50:11.920846 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:12.574867 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1137984434.mount: Deactivated successfully. Jan 13 21:50:12.922402 kubelet[1917]: E0113 21:50:12.921997 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:13.922745 kubelet[1917]: E0113 21:50:13.922631 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:14.923936 kubelet[1917]: E0113 21:50:14.923647 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:15.556639 containerd[1511]: time="2025-01-13T21:50:15.556575180Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:15.558052 containerd[1511]: time="2025-01-13T21:50:15.558007047Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=91039414" Jan 13 21:50:15.558943 containerd[1511]: time="2025-01-13T21:50:15.558867188Z" level=info msg="ImageCreate event name:\"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:15.562666 containerd[1511]: time="2025-01-13T21:50:15.562607101Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:15.565209 containerd[1511]: time="2025-01-13T21:50:15.564169416Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\", repo tag \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest 
\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"91036984\" in 6.96152822s" Jan 13 21:50:15.565209 containerd[1511]: time="2025-01-13T21:50:15.564217056Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\"" Jan 13 21:50:15.568019 containerd[1511]: time="2025-01-13T21:50:15.567964072Z" level=info msg="CreateContainer within sandbox \"ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}" Jan 13 21:50:15.598725 containerd[1511]: time="2025-01-13T21:50:15.598572105Z" level=info msg="CreateContainer within sandbox \"ad7cca150c0d70281eb561101fe3dda4b11e551f78b09899538a47635dfc9f24\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"c3777c5599d828a9db8cd730bfad748f84f6493651f3d9bf8c8bf6cbeae994ff\"" Jan 13 21:50:15.599528 containerd[1511]: time="2025-01-13T21:50:15.599198364Z" level=info msg="StartContainer for \"c3777c5599d828a9db8cd730bfad748f84f6493651f3d9bf8c8bf6cbeae994ff\"" Jan 13 21:50:15.641605 systemd[1]: Started cri-containerd-c3777c5599d828a9db8cd730bfad748f84f6493651f3d9bf8c8bf6cbeae994ff.scope - libcontainer container c3777c5599d828a9db8cd730bfad748f84f6493651f3d9bf8c8bf6cbeae994ff. Jan 13 21:50:15.683168 containerd[1511]: time="2025-01-13T21:50:15.682354066Z" level=info msg="StartContainer for \"c3777c5599d828a9db8cd730bfad748f84f6493651f3d9bf8c8bf6cbeae994ff\" returns successfully" Jan 13 21:50:15.924395 kubelet[1917]: E0113 21:50:15.924157 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:16.415625 kubelet[1917]: I0113 21:50:16.415517 1917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=2.451577985 podStartE2EDuration="9.415496215s" podCreationTimestamp="2025-01-13 21:50:07 +0000 UTC" firstStartedPulling="2025-01-13 21:50:08.602047263 +0000 UTC m=+42.298292248" lastFinishedPulling="2025-01-13 21:50:15.565965495 +0000 UTC m=+49.262210478" observedRunningTime="2025-01-13 21:50:16.415043728 +0000 UTC m=+50.111288713" watchObservedRunningTime="2025-01-13 21:50:16.415496215 +0000 UTC m=+50.111741201" Jan 13 21:50:16.925670 kubelet[1917]: E0113 21:50:16.925593 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:17.926273 kubelet[1917]: E0113 21:50:17.926193 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:18.927217 kubelet[1917]: E0113 21:50:18.927136 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:19.928310 kubelet[1917]: E0113 21:50:19.928217 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:20.929520 kubelet[1917]: E0113 21:50:20.929454 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:21.930359 kubelet[1917]: E0113 21:50:21.930217 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:22.931061 
kubelet[1917]: E0113 21:50:22.930965 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:23.931480 kubelet[1917]: E0113 21:50:23.931408 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:24.932125 kubelet[1917]: E0113 21:50:24.932059 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:25.507344 systemd[1]: Created slice kubepods-besteffort-podd16c38dc_e4ef_435b_a5dd_0bb8f7483d86.slice - libcontainer container kubepods-besteffort-podd16c38dc_e4ef_435b_a5dd_0bb8f7483d86.slice. Jan 13 21:50:25.520487 kubelet[1917]: I0113 21:50:25.520136 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm7mg\" (UniqueName: \"kubernetes.io/projected/d16c38dc-e4ef-435b-a5dd-0bb8f7483d86-kube-api-access-vm7mg\") pod \"test-pod-1\" (UID: \"d16c38dc-e4ef-435b-a5dd-0bb8f7483d86\") " pod="default/test-pod-1" Jan 13 21:50:25.520487 kubelet[1917]: I0113 21:50:25.520193 1917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-03cffeb4-6d00-41b6-8b40-54cd43985cf3\" (UniqueName: \"kubernetes.io/nfs/d16c38dc-e4ef-435b-a5dd-0bb8f7483d86-pvc-03cffeb4-6d00-41b6-8b40-54cd43985cf3\") pod \"test-pod-1\" (UID: \"d16c38dc-e4ef-435b-a5dd-0bb8f7483d86\") " pod="default/test-pod-1" Jan 13 21:50:25.672129 kernel: FS-Cache: Loaded Jan 13 21:50:25.748867 kernel: RPC: Registered named UNIX socket transport module. Jan 13 21:50:25.749126 kernel: RPC: Registered udp transport module. Jan 13 21:50:25.749168 kernel: RPC: Registered tcp transport module. Jan 13 21:50:25.750674 kernel: RPC: Registered tcp-with-tls transport module. Jan 13 21:50:25.751939 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module. 
Jan 13 21:50:25.933349 kubelet[1917]: E0113 21:50:25.933077 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:26.029496 kernel: NFS: Registering the id_resolver key type Jan 13 21:50:26.029717 kernel: Key type id_resolver registered Jan 13 21:50:26.029782 kernel: Key type id_legacy registered Jan 13 21:50:26.089057 nfsidmap[3877]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'gb1.brightbox.com' Jan 13 21:50:26.097475 nfsidmap[3880]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain 'gb1.brightbox.com' Jan 13 21:50:26.111860 containerd[1511]: time="2025-01-13T21:50:26.111726625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:d16c38dc-e4ef-435b-a5dd-0bb8f7483d86,Namespace:default,Attempt:0,}" Jan 13 21:50:26.302784 systemd-networkd[1419]: cali5ec59c6bf6e: Link UP Jan 13 21:50:26.304659 systemd-networkd[1419]: cali5ec59c6bf6e: Gained carrier Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.186 [INFO][3884] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.230.9.94-k8s-test--pod--1-eth0 default d16c38dc-e4ef-435b-a5dd-0bb8f7483d86 1409 0 2025-01-13 21:50:10 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 10.230.9.94 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.187 [INFO][3884] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.235 [INFO][3894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" HandleID="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Workload="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.250 [INFO][3894] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" HandleID="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Workload="10.230.9.94-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319800), Attrs:map[string]string{"namespace":"default", "node":"10.230.9.94", "pod":"test-pod-1", "timestamp":"2025-01-13 21:50:26.234981395 +0000 UTC"}, Hostname:"10.230.9.94", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.251 [INFO][3894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.251 [INFO][3894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.251 [INFO][3894] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.230.9.94' Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.254 [INFO][3894] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.261 [INFO][3894] ipam/ipam.go 372: Looking up existing affinities for host host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.268 [INFO][3894] ipam/ipam.go 489: Trying affinity for 192.168.49.64/26 host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.271 [INFO][3894] ipam/ipam.go 155: Attempting to load block cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.274 [INFO][3894] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.49.64/26 host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.275 [INFO][3894] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.49.64/26 handle="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.278 [INFO][3894] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167 Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.285 [INFO][3894] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.49.64/26 handle="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.293 [INFO][3894] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.49.68/26] block=192.168.49.64/26 handle="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.293 [INFO][3894] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.49.68/26] handle="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" host="10.230.9.94" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.293 [INFO][3894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.293 [INFO][3894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.49.68/26] IPv6=[] ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" HandleID="k8s-pod-network.522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Workload="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.326252 containerd[1511]: 2025-01-13 21:50:26.296 [INFO][3884] cni-plugin/k8s.go 386: Populated endpoint ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"d16c38dc-e4ef-435b-a5dd-0bb8f7483d86", ResourceVersion:"1409", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.49.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:50:26.330174 containerd[1511]: 2025-01-13 21:50:26.296 [INFO][3884] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.49.68/32] ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.330174 containerd[1511]: 2025-01-13 21:50:26.296 [INFO][3884] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.330174 containerd[1511]: 2025-01-13 21:50:26.305 [INFO][3884] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.330174 containerd[1511]: 2025-01-13 21:50:26.306 [INFO][3884] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.230.9.94-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"d16c38dc-e4ef-435b-a5dd-0bb8f7483d86", ResourceVersion:"1409", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 50, 10, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.230.9.94", ContainerID:"522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.49.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"b6:49:6a:40:d7:82", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:50:26.330174 containerd[1511]: 2025-01-13 21:50:26.323 [INFO][3884] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.230.9.94-k8s-test--pod--1-eth0" Jan 13 21:50:26.374510 containerd[1511]: time="2025-01-13T21:50:26.373810479Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:50:26.374510 containerd[1511]: time="2025-01-13T21:50:26.374082993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:50:26.374510 containerd[1511]: time="2025-01-13T21:50:26.374117018Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:50:26.375739 containerd[1511]: time="2025-01-13T21:50:26.375645730Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:50:26.405658 systemd[1]: Started cri-containerd-522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167.scope - libcontainer container 522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167. 
Jan 13 21:50:26.470515 containerd[1511]: time="2025-01-13T21:50:26.470419973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:d16c38dc-e4ef-435b-a5dd-0bb8f7483d86,Namespace:default,Attempt:0,} returns sandbox id \"522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167\"" Jan 13 21:50:26.473758 containerd[1511]: time="2025-01-13T21:50:26.473721641Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Jan 13 21:50:26.808609 containerd[1511]: time="2025-01-13T21:50:26.808389136Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:50:26.809284 containerd[1511]: time="2025-01-13T21:50:26.809227700Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61" Jan 13 21:50:26.815549 containerd[1511]: time="2025-01-13T21:50:26.815421231Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:29ef6eaebfc53650f3a4609edbf9d35e866f56b2c5e01d32d93439031b300f0b\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:eca1d1ff18c7af45f86b7e0b572090f563a676ddca3da2ecff678390366335ad\", size \"71035896\" in 341.644702ms" Jan 13 21:50:26.815549 containerd[1511]: time="2025-01-13T21:50:26.815467046Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:29ef6eaebfc53650f3a4609edbf9d35e866f56b2c5e01d32d93439031b300f0b\"" Jan 13 21:50:26.818577 containerd[1511]: time="2025-01-13T21:50:26.818513013Z" level=info msg="CreateContainer within sandbox \"522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167\" for container &ContainerMetadata{Name:test,Attempt:0,}" Jan 13 21:50:26.836355 containerd[1511]: time="2025-01-13T21:50:26.836283659Z" level=info msg="CreateContainer within sandbox \"522bf8f5ddd795517ca0f7ee4f06c5fbf41be64413b16db3badc19e200ba5167\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"e271ed6b991c94969633993bcb3e26402f2156efe89dbc0cf213737594df8920\"" Jan 13 21:50:26.841396 containerd[1511]: time="2025-01-13T21:50:26.840609375Z" level=info msg="StartContainer for \"e271ed6b991c94969633993bcb3e26402f2156efe89dbc0cf213737594df8920\"" Jan 13 21:50:26.852046 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1517848151.mount: Deactivated successfully. Jan 13 21:50:26.886530 kubelet[1917]: E0113 21:50:26.886471 1917 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:26.887744 systemd[1]: Started cri-containerd-e271ed6b991c94969633993bcb3e26402f2156efe89dbc0cf213737594df8920.scope - libcontainer container e271ed6b991c94969633993bcb3e26402f2156efe89dbc0cf213737594df8920. 
Jan 13 21:50:26.919803 containerd[1511]: time="2025-01-13T21:50:26.919490148Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:50:26.919803 containerd[1511]: time="2025-01-13T21:50:26.919738263Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:50:26.919803 containerd[1511]: time="2025-01-13T21:50:26.919758182Z" level=info msg="StopPodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:50:26.924582 containerd[1511]: time="2025-01-13T21:50:26.924363585Z" level=info msg="RemovePodSandbox for \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:50:26.933252 kubelet[1917]: E0113 21:50:26.933194 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:26.934369 containerd[1511]: time="2025-01-13T21:50:26.933369659Z" level=info msg="StartContainer for \"e271ed6b991c94969633993bcb3e26402f2156efe89dbc0cf213737594df8920\" returns successfully" Jan 13 21:50:26.938881 containerd[1511]: time="2025-01-13T21:50:26.938848097Z" level=info msg="Forcibly stopping sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\"" Jan 13 21:50:26.958388 containerd[1511]: time="2025-01-13T21:50:26.939088571Z" level=info msg="TearDown network for sandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" successfully" Jan 13 21:50:26.977818 containerd[1511]: time="2025-01-13T21:50:26.977633214Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:26.977818 containerd[1511]: time="2025-01-13T21:50:26.977737012Z" level=info msg="RemovePodSandbox \"0220dbc0ed794cc687f7f26b0545e215f7498a651aba9d309d5ab5211d43e0cf\" returns successfully" Jan 13 21:50:26.979081 containerd[1511]: time="2025-01-13T21:50:26.979041964Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:50:26.979233 containerd[1511]: time="2025-01-13T21:50:26.979169126Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:50:26.979233 containerd[1511]: time="2025-01-13T21:50:26.979194619Z" level=info msg="StopPodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:50:26.981372 containerd[1511]: time="2025-01-13T21:50:26.980757047Z" level=info msg="RemovePodSandbox for \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:50:26.981372 containerd[1511]: time="2025-01-13T21:50:26.980818775Z" level=info msg="Forcibly stopping sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\"" Jan 13 21:50:26.981372 containerd[1511]: time="2025-01-13T21:50:26.980914827Z" level=info msg="TearDown network for sandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" successfully" Jan 13 21:50:26.995413 containerd[1511]: time="2025-01-13T21:50:26.995379002Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:26.995587 containerd[1511]: time="2025-01-13T21:50:26.995558332Z" level=info msg="RemovePodSandbox \"5de0aa22754d9251e8f39fb28d67dd2ef7ddeac17b8d431688dc07cb15fb851f\" returns successfully" Jan 13 21:50:26.996137 containerd[1511]: time="2025-01-13T21:50:26.996107108Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:50:26.996608 containerd[1511]: time="2025-01-13T21:50:26.996506326Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:50:26.996608 containerd[1511]: time="2025-01-13T21:50:26.996532288Z" level=info msg="StopPodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:50:26.998059 containerd[1511]: time="2025-01-13T21:50:26.996925872Z" level=info msg="RemovePodSandbox for \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:50:26.998059 containerd[1511]: time="2025-01-13T21:50:26.996962600Z" level=info msg="Forcibly stopping sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\"" Jan 13 21:50:26.998059 containerd[1511]: time="2025-01-13T21:50:26.997067649Z" level=info msg="TearDown network for sandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" successfully" Jan 13 21:50:27.000073 containerd[1511]: time="2025-01-13T21:50:27.000037486Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.000269 containerd[1511]: time="2025-01-13T21:50:27.000238716Z" level=info msg="RemovePodSandbox \"3a17c87217af6e13dffeb9c9e5bdbd63b3fd97d6a0c345299d82250698618856\" returns successfully" Jan 13 21:50:27.000772 containerd[1511]: time="2025-01-13T21:50:27.000734494Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:50:27.000975 containerd[1511]: time="2025-01-13T21:50:27.000938324Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:50:27.001095 containerd[1511]: time="2025-01-13T21:50:27.001067717Z" level=info msg="StopPodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:50:27.001540 containerd[1511]: time="2025-01-13T21:50:27.001511290Z" level=info msg="RemovePodSandbox for \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:50:27.001997 containerd[1511]: time="2025-01-13T21:50:27.001708022Z" level=info msg="Forcibly stopping sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\"" Jan 13 21:50:27.001997 containerd[1511]: time="2025-01-13T21:50:27.001873021Z" level=info msg="TearDown network for sandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" successfully" Jan 13 21:50:27.005357 containerd[1511]: time="2025-01-13T21:50:27.005141433Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.005357 containerd[1511]: time="2025-01-13T21:50:27.005219065Z" level=info msg="RemovePodSandbox \"9625825814a72075a378d03d01829d0da3ba59866bdabce330f14c9e2214ce98\" returns successfully" Jan 13 21:50:27.006071 containerd[1511]: time="2025-01-13T21:50:27.005838988Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:50:27.006071 containerd[1511]: time="2025-01-13T21:50:27.005947221Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:50:27.006071 containerd[1511]: time="2025-01-13T21:50:27.005989798Z" level=info msg="StopPodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:50:27.007368 containerd[1511]: time="2025-01-13T21:50:27.006443712Z" level=info msg="RemovePodSandbox for \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:50:27.007368 containerd[1511]: time="2025-01-13T21:50:27.006479726Z" level=info msg="Forcibly stopping sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\"" Jan 13 21:50:27.007368 containerd[1511]: time="2025-01-13T21:50:27.006602336Z" level=info msg="TearDown network for sandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" successfully" Jan 13 21:50:27.009545 containerd[1511]: time="2025-01-13T21:50:27.009513770Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.009723 containerd[1511]: time="2025-01-13T21:50:27.009695362Z" level=info msg="RemovePodSandbox \"74dfc73d8beb750288dc59fa11eb6bf011a7554047e5cc9be7df1b1c339bd906\" returns successfully" Jan 13 21:50:27.010385 containerd[1511]: time="2025-01-13T21:50:27.010357561Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:50:27.010663 containerd[1511]: time="2025-01-13T21:50:27.010637860Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:50:27.010880 containerd[1511]: time="2025-01-13T21:50:27.010745608Z" level=info msg="StopPodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:50:27.011786 containerd[1511]: time="2025-01-13T21:50:27.011163988Z" level=info msg="RemovePodSandbox for \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:50:27.011786 containerd[1511]: time="2025-01-13T21:50:27.011217411Z" level=info msg="Forcibly stopping sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\"" Jan 13 21:50:27.011786 containerd[1511]: time="2025-01-13T21:50:27.011353590Z" level=info msg="TearDown network for sandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" successfully" Jan 13 21:50:27.014188 containerd[1511]: time="2025-01-13T21:50:27.014149044Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.014416 containerd[1511]: time="2025-01-13T21:50:27.014386686Z" level=info msg="RemovePodSandbox \"5c63e8b5ca0574f6a4eb65c3a21ead99063982a7dfaf4b80c696c68980677061\" returns successfully" Jan 13 21:50:27.014950 containerd[1511]: time="2025-01-13T21:50:27.014921941Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:50:27.015296 containerd[1511]: time="2025-01-13T21:50:27.015269534Z" level=info msg="TearDown network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" successfully" Jan 13 21:50:27.015464 containerd[1511]: time="2025-01-13T21:50:27.015421701Z" level=info msg="StopPodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" returns successfully" Jan 13 21:50:27.017023 containerd[1511]: time="2025-01-13T21:50:27.015844486Z" level=info msg="RemovePodSandbox for \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:50:27.017023 containerd[1511]: time="2025-01-13T21:50:27.015878243Z" level=info msg="Forcibly stopping sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\"" Jan 13 21:50:27.017023 containerd[1511]: time="2025-01-13T21:50:27.016017108Z" level=info msg="TearDown network for sandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" successfully" Jan 13 21:50:27.018857 containerd[1511]: time="2025-01-13T21:50:27.018793353Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.019002 containerd[1511]: time="2025-01-13T21:50:27.018954973Z" level=info msg="RemovePodSandbox \"8e7ea4c75ab4d4a80312f6b94cc417c8e29b23604bfddac2b1cd4b4c16a50d59\" returns successfully" Jan 13 21:50:27.019514 containerd[1511]: time="2025-01-13T21:50:27.019485938Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" Jan 13 21:50:27.019716 containerd[1511]: time="2025-01-13T21:50:27.019689065Z" level=info msg="TearDown network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" successfully" Jan 13 21:50:27.019822 containerd[1511]: time="2025-01-13T21:50:27.019798887Z" level=info msg="StopPodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" returns successfully" Jan 13 21:50:27.020392 containerd[1511]: time="2025-01-13T21:50:27.020332818Z" level=info msg="RemovePodSandbox for \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" Jan 13 21:50:27.020640 containerd[1511]: time="2025-01-13T21:50:27.020601397Z" level=info msg="Forcibly stopping sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\"" Jan 13 21:50:27.021360 containerd[1511]: time="2025-01-13T21:50:27.020833762Z" level=info msg="TearDown network for sandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" successfully" Jan 13 21:50:27.023615 containerd[1511]: time="2025-01-13T21:50:27.023582866Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.023764 containerd[1511]: time="2025-01-13T21:50:27.023736940Z" level=info msg="RemovePodSandbox \"39ad561ba25a73f586609de8dfe47643bf81c0742fa0139011ba7f8c1abd82b0\" returns successfully" Jan 13 21:50:27.024403 containerd[1511]: time="2025-01-13T21:50:27.024358842Z" level=info msg="StopPodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\"" Jan 13 21:50:27.024577 containerd[1511]: time="2025-01-13T21:50:27.024547982Z" level=info msg="TearDown network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" successfully" Jan 13 21:50:27.024577 containerd[1511]: time="2025-01-13T21:50:27.024572220Z" level=info msg="StopPodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" returns successfully" Jan 13 21:50:27.024975 containerd[1511]: time="2025-01-13T21:50:27.024923273Z" level=info msg="RemovePodSandbox for \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\"" Jan 13 21:50:27.025063 containerd[1511]: time="2025-01-13T21:50:27.024981129Z" level=info msg="Forcibly stopping sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\"" Jan 13 21:50:27.025358 containerd[1511]: time="2025-01-13T21:50:27.025087933Z" level=info msg="TearDown network for sandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" successfully" Jan 13 21:50:27.038555 containerd[1511]: time="2025-01-13T21:50:27.038399897Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.038555 containerd[1511]: time="2025-01-13T21:50:27.038452865Z" level=info msg="RemovePodSandbox \"710104ff3a563130904196d5e3c75c1f6ff0e2f0ec8bd800139a078b83138b42\" returns successfully" Jan 13 21:50:27.040366 containerd[1511]: time="2025-01-13T21:50:27.039543443Z" level=info msg="StopPodSandbox for \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\"" Jan 13 21:50:27.042943 containerd[1511]: time="2025-01-13T21:50:27.042882676Z" level=info msg="TearDown network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\" successfully" Jan 13 21:50:27.043366 containerd[1511]: time="2025-01-13T21:50:27.043061495Z" level=info msg="StopPodSandbox for \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\" returns successfully" Jan 13 21:50:27.043565 containerd[1511]: time="2025-01-13T21:50:27.043493961Z" level=info msg="RemovePodSandbox for \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\"" Jan 13 21:50:27.043633 containerd[1511]: time="2025-01-13T21:50:27.043590335Z" level=info msg="Forcibly stopping sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\"" Jan 13 21:50:27.043819 containerd[1511]: time="2025-01-13T21:50:27.043751829Z" level=info msg="TearDown network for sandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\" successfully" Jan 13 21:50:27.049387 containerd[1511]: time="2025-01-13T21:50:27.049352909Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.049644 containerd[1511]: time="2025-01-13T21:50:27.049503350Z" level=info msg="RemovePodSandbox \"f00dc6d2371701a063fc959624fd0f449c4a1e1e3c768d9359e2eba1a3b4b2ae\" returns successfully" Jan 13 21:50:27.050355 containerd[1511]: time="2025-01-13T21:50:27.050122194Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:50:27.050355 containerd[1511]: time="2025-01-13T21:50:27.050234675Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:50:27.050355 containerd[1511]: time="2025-01-13T21:50:27.050282063Z" level=info msg="StopPodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:50:27.051299 containerd[1511]: time="2025-01-13T21:50:27.051161857Z" level=info msg="RemovePodSandbox for \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:50:27.051299 containerd[1511]: time="2025-01-13T21:50:27.051216331Z" level=info msg="Forcibly stopping sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\"" Jan 13 21:50:27.051763 containerd[1511]: time="2025-01-13T21:50:27.051620514Z" level=info msg="TearDown network for sandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" successfully" Jan 13 21:50:27.054369 containerd[1511]: time="2025-01-13T21:50:27.054292481Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.054369 containerd[1511]: time="2025-01-13T21:50:27.054358359Z" level=info msg="RemovePodSandbox \"4a09c052950edad2e2c5643650e244b45c36c9c8882d28e49cf91997890c77e7\" returns successfully" Jan 13 21:50:27.055114 containerd[1511]: time="2025-01-13T21:50:27.055083905Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:50:27.055513 containerd[1511]: time="2025-01-13T21:50:27.055372712Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:50:27.055513 containerd[1511]: time="2025-01-13T21:50:27.055398662Z" level=info msg="StopPodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:50:27.056086 containerd[1511]: time="2025-01-13T21:50:27.055814148Z" level=info msg="RemovePodSandbox for \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:50:27.056086 containerd[1511]: time="2025-01-13T21:50:27.055902240Z" level=info msg="Forcibly stopping sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\"" Jan 13 21:50:27.057230 containerd[1511]: time="2025-01-13T21:50:27.056065474Z" level=info msg="TearDown network for sandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" successfully" Jan 13 21:50:27.060495 containerd[1511]: time="2025-01-13T21:50:27.058954560Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.060495 containerd[1511]: time="2025-01-13T21:50:27.059656280Z" level=info msg="RemovePodSandbox \"81836d84867a131fc1c7efa8ff755c209ef369152b69cc03f70e0da6b32c8b3d\" returns successfully" Jan 13 21:50:27.061269 containerd[1511]: time="2025-01-13T21:50:27.061167647Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:50:27.062061 containerd[1511]: time="2025-01-13T21:50:27.061691445Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:50:27.062061 containerd[1511]: time="2025-01-13T21:50:27.061794555Z" level=info msg="StopPodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:50:27.063376 containerd[1511]: time="2025-01-13T21:50:27.062614980Z" level=info msg="RemovePodSandbox for \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:50:27.063376 containerd[1511]: time="2025-01-13T21:50:27.062662456Z" level=info msg="Forcibly stopping sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\"" Jan 13 21:50:27.063376 containerd[1511]: time="2025-01-13T21:50:27.062751958Z" level=info msg="TearDown network for sandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" successfully" Jan 13 21:50:27.065426 containerd[1511]: time="2025-01-13T21:50:27.065389963Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.065529 containerd[1511]: time="2025-01-13T21:50:27.065437452Z" level=info msg="RemovePodSandbox \"4c2a1d187d38d4ff5a3aef151798bc77aac8fa3f171bd3edbeabfbe78e11417a\" returns successfully" Jan 13 21:50:27.066468 containerd[1511]: time="2025-01-13T21:50:27.066208458Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:50:27.066468 containerd[1511]: time="2025-01-13T21:50:27.066369624Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:50:27.066468 containerd[1511]: time="2025-01-13T21:50:27.066402360Z" level=info msg="StopPodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:50:27.067036 containerd[1511]: time="2025-01-13T21:50:27.066897687Z" level=info msg="RemovePodSandbox for \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:50:27.067233 containerd[1511]: time="2025-01-13T21:50:27.066927847Z" level=info msg="Forcibly stopping sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\"" Jan 13 21:50:27.067673 containerd[1511]: time="2025-01-13T21:50:27.067470571Z" level=info msg="TearDown network for sandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" successfully" Jan 13 21:50:27.070380 containerd[1511]: time="2025-01-13T21:50:27.070317917Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.070467 containerd[1511]: time="2025-01-13T21:50:27.070383219Z" level=info msg="RemovePodSandbox \"48b7a30776c807fa8df9b20b5cd8450cce11f9fa2089f6184e790bdfc7d87daf\" returns successfully" Jan 13 21:50:27.070865 containerd[1511]: time="2025-01-13T21:50:27.070818492Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:50:27.070999 containerd[1511]: time="2025-01-13T21:50:27.070933912Z" level=info msg="TearDown network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" successfully" Jan 13 21:50:27.071063 containerd[1511]: time="2025-01-13T21:50:27.070995454Z" level=info msg="StopPodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" returns successfully" Jan 13 21:50:27.071867 containerd[1511]: time="2025-01-13T21:50:27.071414309Z" level=info msg="RemovePodSandbox for \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:50:27.071867 containerd[1511]: time="2025-01-13T21:50:27.071450765Z" level=info msg="Forcibly stopping sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\"" Jan 13 21:50:27.071867 containerd[1511]: time="2025-01-13T21:50:27.071580551Z" level=info msg="TearDown network for sandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" successfully" Jan 13 21:50:27.073985 containerd[1511]: time="2025-01-13T21:50:27.073922235Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.073985 containerd[1511]: time="2025-01-13T21:50:27.073981003Z" level=info msg="RemovePodSandbox \"ea381e365d5f1978c66e904c56793269bb7b4c9bb79bb6b3b3dcb0d2945e0d7e\" returns successfully" Jan 13 21:50:27.074707 containerd[1511]: time="2025-01-13T21:50:27.074482974Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" Jan 13 21:50:27.074707 containerd[1511]: time="2025-01-13T21:50:27.074611693Z" level=info msg="TearDown network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" successfully" Jan 13 21:50:27.074707 containerd[1511]: time="2025-01-13T21:50:27.074630758Z" level=info msg="StopPodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" returns successfully" Jan 13 21:50:27.075374 containerd[1511]: time="2025-01-13T21:50:27.075255162Z" level=info msg="RemovePodSandbox for \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" Jan 13 21:50:27.075374 containerd[1511]: time="2025-01-13T21:50:27.075290165Z" level=info msg="Forcibly stopping sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\"" Jan 13 21:50:27.075565 containerd[1511]: time="2025-01-13T21:50:27.075409996Z" level=info msg="TearDown network for sandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" successfully" Jan 13 21:50:27.077831 containerd[1511]: time="2025-01-13T21:50:27.077790510Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.078052 containerd[1511]: time="2025-01-13T21:50:27.077839989Z" level=info msg="RemovePodSandbox \"9a6647295b3ccda984ef5d2ad9e28e5bac305d785f11732bad982b7dfb497ee8\" returns successfully" Jan 13 21:50:27.078622 containerd[1511]: time="2025-01-13T21:50:27.078570355Z" level=info msg="StopPodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\"" Jan 13 21:50:27.078956 containerd[1511]: time="2025-01-13T21:50:27.078706241Z" level=info msg="TearDown network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" successfully" Jan 13 21:50:27.078956 containerd[1511]: time="2025-01-13T21:50:27.078731070Z" level=info msg="StopPodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" returns successfully" Jan 13 21:50:27.079371 containerd[1511]: time="2025-01-13T21:50:27.079213884Z" level=info msg="RemovePodSandbox for \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\"" Jan 13 21:50:27.079371 containerd[1511]: time="2025-01-13T21:50:27.079249251Z" level=info msg="Forcibly stopping sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\"" Jan 13 21:50:27.080034 containerd[1511]: time="2025-01-13T21:50:27.079644561Z" level=info msg="TearDown network for sandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" successfully" Jan 13 21:50:27.083171 containerd[1511]: time="2025-01-13T21:50:27.082819090Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:50:27.083171 containerd[1511]: time="2025-01-13T21:50:27.082899691Z" level=info msg="RemovePodSandbox \"db5a431c9bd40bbb7b39170e6a1118f49322bb12f970d23197bc1264b831bcfa\" returns successfully" Jan 13 21:50:27.083387 containerd[1511]: time="2025-01-13T21:50:27.083351344Z" level=info msg="StopPodSandbox for \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\"" Jan 13 21:50:27.083490 containerd[1511]: time="2025-01-13T21:50:27.083460684Z" level=info msg="TearDown network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\" successfully" Jan 13 21:50:27.083490 containerd[1511]: time="2025-01-13T21:50:27.083485587Z" level=info msg="StopPodSandbox for \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\" returns successfully" Jan 13 21:50:27.083877 containerd[1511]: time="2025-01-13T21:50:27.083847992Z" level=info msg="RemovePodSandbox for \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\"" Jan 13 21:50:27.083967 containerd[1511]: time="2025-01-13T21:50:27.083900651Z" level=info msg="Forcibly stopping sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\"" Jan 13 21:50:27.084148 containerd[1511]: time="2025-01-13T21:50:27.084049732Z" level=info msg="TearDown network for sandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\" successfully" Jan 13 21:50:27.086700 containerd[1511]: time="2025-01-13T21:50:27.086656393Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 13 21:50:27.086839 containerd[1511]: time="2025-01-13T21:50:27.086708872Z" level=info msg="RemovePodSandbox \"b025fe8c4142ce7cdf3bd8ca57b2064ed8b192f520936562e18f3b215ace0cc8\" returns successfully" Jan 13 21:50:27.444284 kubelet[1917]: I0113 21:50:27.444053 1917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/test-pod-1" podStartSLOduration=17.100418805 podStartE2EDuration="17.444032047s" podCreationTimestamp="2025-01-13 21:50:10 +0000 UTC" firstStartedPulling="2025-01-13 21:50:26.472803977 +0000 UTC m=+60.169048953" lastFinishedPulling="2025-01-13 21:50:26.816417216 +0000 UTC m=+60.512662195" observedRunningTime="2025-01-13 21:50:27.443851588 +0000 UTC m=+61.140096574" watchObservedRunningTime="2025-01-13 21:50:27.444032047 +0000 UTC m=+61.140277042" Jan 13 21:50:27.933803 kubelet[1917]: E0113 21:50:27.933696 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:28.080648 systemd-networkd[1419]: cali5ec59c6bf6e: Gained IPv6LL Jan 13 21:50:28.934192 kubelet[1917]: E0113 21:50:28.934109 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:29.935230 kubelet[1917]: E0113 21:50:29.935149 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:30.935843 kubelet[1917]: E0113 21:50:30.935749 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:31.936631 kubelet[1917]: E0113 21:50:31.936525 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:32.118911 systemd[1]: 
run-containerd-runc-k8s.io-eff1cf7bacd02cff43066529bc97e6abeab7398862488ceabb152a3f7346cdad-runc.r622NT.mount: Deactivated successfully. Jan 13 21:50:32.937554 kubelet[1917]: E0113 21:50:32.937436 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:33.938062 kubelet[1917]: E0113 21:50:33.937978 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:34.938508 kubelet[1917]: E0113 21:50:34.938423 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Jan 13 21:50:35.938762 kubelet[1917]: E0113 21:50:35.938673 1917 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"