Mar 17 18:00:57.067369 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Mon Mar 17 16:09:25 -00 2025
Mar 17 18:00:57.067400 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=2a4a0f64c0160ed10b339be09fdc9d7e265b13f78aefc87616e79bf13c00bb1c
Mar 17 18:00:57.067411 kernel: BIOS-provided physical RAM map:
Mar 17 18:00:57.067418 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 17 18:00:57.067427 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
Mar 17 18:00:57.067434 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable
Mar 17 18:00:57.067443 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc8fff] reserved
Mar 17 18:00:57.067450 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data
Mar 17 18:00:57.067461 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
Mar 17 18:00:57.067468 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
Mar 17 18:00:57.067476 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
Mar 17 18:00:57.067483 kernel: printk: bootconsole [earlyser0] enabled
Mar 17 18:00:57.067490 kernel: NX (Execute Disable) protection: active
Mar 17 18:00:57.067499 kernel: APIC: Static calls initialized
Mar 17 18:00:57.067510 kernel: efi: EFI v2.7 by Microsoft
Mar 17 18:00:57.067520 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3ee73a98 RNG=0x3ffd1018
Mar 17 18:00:57.067527 kernel: random: crng init done
Mar 17 18:00:57.067538 kernel: secureboot: Secure boot disabled
Mar 17 18:00:57.067545 kernel: SMBIOS 3.1.0 present.
Mar 17 18:00:57.067555 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 03/08/2024
Mar 17 18:00:57.067563 kernel: Hypervisor detected: Microsoft Hyper-V
Mar 17 18:00:57.067570 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x64e24, misc 0xbed7b2
Mar 17 18:00:57.067580 kernel: Hyper-V: Host Build 10.0.20348.1799-1-0
Mar 17 18:00:57.067587 kernel: Hyper-V: Nested features: 0x1e0101
Mar 17 18:00:57.067599 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
Mar 17 18:00:57.067606 kernel: Hyper-V: Using hypercall for remote TLB flush
Mar 17 18:00:57.067616 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Mar 17 18:00:57.067624 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Mar 17 18:00:57.067634 kernel: tsc: Marking TSC unstable due to running on Hyper-V
Mar 17 18:00:57.067642 kernel: tsc: Detected 2593.908 MHz processor
Mar 17 18:00:57.067651 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 17 18:00:57.067660 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 17 18:00:57.067667 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x400000000
Mar 17 18:00:57.067680 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 17 18:00:57.067688 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 17 18:00:57.067698 kernel: e820: update [mem 0x40000000-0xffffffff] usable ==> reserved
Mar 17 18:00:57.067705 kernel: last_pfn = 0x40000 max_arch_pfn = 0x400000000
Mar 17 18:00:57.067715 kernel: Using GB pages for direct mapping
Mar 17 18:00:57.067722 kernel: ACPI: Early table checksum verification disabled
Mar 17 18:00:57.067733 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
Mar 17 18:00:57.067745 kernel: ACPI: XSDT 0x000000003FFF90E8 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067757 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067768 kernel: ACPI: DSDT 0x000000003FFD6000 01E184 (v02 MSFTVM DSDT01 00000001 MSFT 05000000)
Mar 17 18:00:57.067775 kernel: ACPI: FACS 0x000000003FFFE000 000040
Mar 17 18:00:57.067787 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067794 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067806 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067816 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067827 kernel: ACPI: SRAT 0x000000003FFD4000 0002D0 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067834 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067845 kernel: ACPI: FPDT 0x000000003FFD2000 000034 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 17 18:00:57.067853 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
Mar 17 18:00:57.067863 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4183]
Mar 17 18:00:57.067872 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
Mar 17 18:00:57.067882 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
Mar 17 18:00:57.067893 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
Mar 17 18:00:57.067906 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
Mar 17 18:00:57.067919 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5057]
Mar 17 18:00:57.067934 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd42cf]
Mar 17 18:00:57.067953 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
Mar 17 18:00:57.067969 kernel: ACPI: Reserving FPDT table memory at [mem 0x3ffd2000-0x3ffd2033]
Mar 17 18:00:57.067985 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 17 18:00:57.068004 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 17 18:00:57.068019 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 17 18:00:57.068034 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] hotplug
Mar 17 18:00:57.068051 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2c0000000-0xfdfffffff] hotplug
Mar 17 18:00:57.068066 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 17 18:00:57.068081 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 17 18:00:57.068097 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 17 18:00:57.068113 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 17 18:00:57.068136 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 17 18:00:57.068153 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 17 18:00:57.068170 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 17 18:00:57.068212 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 17 18:00:57.068229 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 17 18:00:57.068245 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000000-0x1ffffffffffff] hotplug
Mar 17 18:00:57.068261 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2000000000000-0x3ffffffffffff] hotplug
Mar 17 18:00:57.068277 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x4000000000000-0x7ffffffffffff] hotplug
Mar 17 18:00:57.068291 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x8000000000000-0xfffffffffffff] hotplug
Mar 17 18:00:57.068309 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00000000-0x2bfffffff]
Mar 17 18:00:57.068323 kernel: NODE_DATA(0) allocated [mem 0x2bfffa000-0x2bfffffff]
Mar 17 18:00:57.068339 kernel: Zone ranges:
Mar 17 18:00:57.068361 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 17 18:00:57.068379 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 17 18:00:57.068396 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
Mar 17 18:00:57.068410 kernel: Movable zone start for each node
Mar 17 18:00:57.068426 kernel: Early memory node ranges
Mar 17 18:00:57.068444 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 17 18:00:57.068458 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff]
Mar 17 18:00:57.068474 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
Mar 17 18:00:57.068491 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
Mar 17 18:00:57.068513 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
Mar 17 18:00:57.068529 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 17 18:00:57.068546 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 17 18:00:57.068563 kernel: On node 0, zone DMA32: 190 pages in unavailable ranges
Mar 17 18:00:57.068578 kernel: ACPI: PM-Timer IO Port: 0x408
Mar 17 18:00:57.068594 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1])
Mar 17 18:00:57.068611 kernel: IOAPIC[0]: apic_id 2, version 17, address 0xfec00000, GSI 0-23
Mar 17 18:00:57.068627 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 17 18:00:57.068639 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 17 18:00:57.068654 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
Mar 17 18:00:57.068667 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 17 18:00:57.068681 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
Mar 17 18:00:57.068694 kernel: Booting paravirtualized kernel on Hyper-V
Mar 17 18:00:57.068708 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 17 18:00:57.068721 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 17 18:00:57.068734 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 17 18:00:57.068747 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 17 18:00:57.068759 kernel: pcpu-alloc: [0] 0 1
Mar 17 18:00:57.068775 kernel: Hyper-V: PV spinlocks enabled
Mar 17 18:00:57.068787 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 17 18:00:57.068803 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=2a4a0f64c0160ed10b339be09fdc9d7e265b13f78aefc87616e79bf13c00bb1c
Mar 17 18:00:57.068817 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 17 18:00:57.068829 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Mar 17 18:00:57.068842 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 17 18:00:57.068856 kernel: Fallback order for Node 0: 0
Mar 17 18:00:57.068870 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2062618
Mar 17 18:00:57.068887 kernel: Policy zone: Normal
Mar 17 18:00:57.068912 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 17 18:00:57.068926 kernel: software IO TLB: area num 2.
Mar 17 18:00:57.068944 kernel: Memory: 8075040K/8387460K available (14336K kernel code, 2303K rwdata, 22860K rodata, 43476K init, 1596K bss, 312164K reserved, 0K cma-reserved)
Mar 17 18:00:57.068958 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 17 18:00:57.068972 kernel: ftrace: allocating 37910 entries in 149 pages
Mar 17 18:00:57.068987 kernel: ftrace: allocated 149 pages with 4 groups
Mar 17 18:00:57.069001 kernel: Dynamic Preempt: voluntary
Mar 17 18:00:57.069015 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 17 18:00:57.069031 kernel: rcu: RCU event tracing is enabled.
Mar 17 18:00:57.069046 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 17 18:00:57.069063 kernel: Trampoline variant of Tasks RCU enabled.
Mar 17 18:00:57.069077 kernel: Rude variant of Tasks RCU enabled.
Mar 17 18:00:57.069092 kernel: Tracing variant of Tasks RCU enabled.
Mar 17 18:00:57.069106 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 17 18:00:57.069121 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 17 18:00:57.069135 kernel: Using NULL legacy PIC
Mar 17 18:00:57.069152 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
Mar 17 18:00:57.069166 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 17 18:00:57.069180 kernel: Console: colour dummy device 80x25
Mar 17 18:00:57.069245 kernel: printk: console [tty1] enabled
Mar 17 18:00:57.069260 kernel: printk: console [ttyS0] enabled
Mar 17 18:00:57.069275 kernel: printk: bootconsole [earlyser0] disabled
Mar 17 18:00:57.069289 kernel: ACPI: Core revision 20230628
Mar 17 18:00:57.069304 kernel: Failed to register legacy timer interrupt
Mar 17 18:00:57.069318 kernel: APIC: Switch to symmetric I/O mode setup
Mar 17 18:00:57.069336 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 17 18:00:57.069351 kernel: Hyper-V: Using IPI hypercalls
Mar 17 18:00:57.069365 kernel: APIC: send_IPI() replaced with hv_send_ipi()
Mar 17 18:00:57.069380 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
Mar 17 18:00:57.069395 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
Mar 17 18:00:57.069409 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
Mar 17 18:00:57.069424 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
Mar 17 18:00:57.069439 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
Mar 17 18:00:57.069454 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 5187.81 BogoMIPS (lpj=2593908)
Mar 17 18:00:57.069471 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Mar 17 18:00:57.069486 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Mar 17 18:00:57.069500 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 17 18:00:57.069514 kernel: Spectre V2 : Mitigation: Retpolines
Mar 17 18:00:57.069529 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 17 18:00:57.069543 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 17 18:00:57.069558 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Mar 17 18:00:57.069572 kernel: RETBleed: Vulnerable
Mar 17 18:00:57.069587 kernel: Speculative Store Bypass: Vulnerable
Mar 17 18:00:57.069601 kernel: TAA: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 17 18:00:57.069618 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 17 18:00:57.069632 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 17 18:00:57.069646 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 17 18:00:57.069659 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 17 18:00:57.069673 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 17 18:00:57.069688 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 17 18:00:57.069701 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 17 18:00:57.069716 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 17 18:00:57.069729 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 17 18:00:57.069745 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 17 18:00:57.069760 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 17 18:00:57.069779 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format.
Mar 17 18:00:57.069792 kernel: Freeing SMP alternatives memory: 32K
Mar 17 18:00:57.069806 kernel: pid_max: default: 32768 minimum: 301
Mar 17 18:00:57.069820 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 17 18:00:57.069834 kernel: landlock: Up and running.
Mar 17 18:00:57.069848 kernel: SELinux: Initializing.
Mar 17 18:00:57.069862 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 17 18:00:57.069876 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 17 18:00:57.069891 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz (family: 0x6, model: 0x55, stepping: 0x7)
Mar 17 18:00:57.069906 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 17 18:00:57.069920 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 17 18:00:57.069939 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 17 18:00:57.069953 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Mar 17 18:00:57.069967 kernel: signal: max sigframe size: 3632
Mar 17 18:00:57.069981 kernel: rcu: Hierarchical SRCU implementation.
Mar 17 18:00:57.069996 kernel: rcu: Max phase no-delay instances is 400.
Mar 17 18:00:57.070010 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 17 18:00:57.070026 kernel: smp: Bringing up secondary CPUs ...
Mar 17 18:00:57.070040 kernel: smpboot: x86: Booting SMP configuration:
Mar 17 18:00:57.070055 kernel: .... node #0, CPUs: #1
Mar 17 18:00:57.070073 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details.
Mar 17 18:00:57.070089 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Mar 17 18:00:57.070102 kernel: smp: Brought up 1 node, 2 CPUs
Mar 17 18:00:57.070116 kernel: smpboot: Max logical packages: 1
Mar 17 18:00:57.070130 kernel: smpboot: Total of 2 processors activated (10375.63 BogoMIPS)
Mar 17 18:00:57.070144 kernel: devtmpfs: initialized
Mar 17 18:00:57.070157 kernel: x86/mm: Memory block size: 128MB
Mar 17 18:00:57.070172 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
Mar 17 18:00:57.070205 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 17 18:00:57.070220 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 17 18:00:57.070234 kernel: pinctrl core: initialized pinctrl subsystem
Mar 17 18:00:57.070247 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 17 18:00:57.070262 kernel: audit: initializing netlink subsys (disabled)
Mar 17 18:00:57.070276 kernel: audit: type=2000 audit(1742234455.027:1): state=initialized audit_enabled=0 res=1
Mar 17 18:00:57.070289 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 17 18:00:57.070303 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 17 18:00:57.070316 kernel: cpuidle: using governor menu
Mar 17 18:00:57.070334 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 17 18:00:57.070348 kernel: dca service started, version 1.12.1
Mar 17 18:00:57.070362 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff]
Mar 17 18:00:57.070376 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 17 18:00:57.070390 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 17 18:00:57.070404 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 17 18:00:57.070418 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 17 18:00:57.070431 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 17 18:00:57.070443 kernel: ACPI: Added _OSI(Module Device)
Mar 17 18:00:57.070459 kernel: ACPI: Added _OSI(Processor Device)
Mar 17 18:00:57.070471 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 17 18:00:57.070482 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 17 18:00:57.070493 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 17 18:00:57.070505 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 17 18:00:57.070516 kernel: ACPI: Interpreter enabled
Mar 17 18:00:57.070526 kernel: ACPI: PM: (supports S0 S5)
Mar 17 18:00:57.070536 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 17 18:00:57.070547 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 17 18:00:57.070560 kernel: PCI: Ignoring E820 reservations for host bridge windows
Mar 17 18:00:57.070571 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
Mar 17 18:00:57.070582 kernel: iommu: Default domain type: Translated
Mar 17 18:00:57.070593 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 17 18:00:57.070603 kernel: efivars: Registered efivars operations
Mar 17 18:00:57.070613 kernel: PCI: Using ACPI for IRQ routing
Mar 17 18:00:57.070624 kernel: PCI: System does not support PCI
Mar 17 18:00:57.070632 kernel: vgaarb: loaded
Mar 17 18:00:57.070640 kernel: clocksource: Switched to clocksource hyperv_clocksource_tsc_page
Mar 17 18:00:57.070652 kernel: VFS: Disk quotas dquot_6.6.0
Mar 17 18:00:57.070660 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 17 18:00:57.070671 kernel: pnp: PnP ACPI init
Mar 17 18:00:57.070679 kernel: pnp: PnP ACPI: found 3 devices
Mar 17 18:00:57.070687 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 17 18:00:57.070696 kernel: NET: Registered PF_INET protocol family
Mar 17 18:00:57.070704 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 17 18:00:57.070712 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Mar 17 18:00:57.070720 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 17 18:00:57.070731 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 17 18:00:57.070739 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Mar 17 18:00:57.070747 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Mar 17 18:00:57.070755 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Mar 17 18:00:57.070766 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Mar 17 18:00:57.070774 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 17 18:00:57.070786 kernel: NET: Registered PF_XDP protocol family
Mar 17 18:00:57.070794 kernel: PCI: CLS 0 bytes, default 64
Mar 17 18:00:57.070802 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 17 18:00:57.070814 kernel: software IO TLB: mapped [mem 0x000000003ae73000-0x000000003ee73000] (64MB)
Mar 17 18:00:57.070823 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Mar 17 18:00:57.070832 kernel: Initialise system trusted keyrings
Mar 17 18:00:57.070842 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Mar 17 18:00:57.070850 kernel: Key type asymmetric registered
Mar 17 18:00:57.070861 kernel: Asymmetric key parser 'x509' registered
Mar 17 18:00:57.070869 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 17 18:00:57.070880 kernel: io scheduler mq-deadline registered
Mar 17 18:00:57.070888 kernel: io scheduler kyber registered
Mar 17 18:00:57.070901 kernel: io scheduler bfq registered
Mar 17 18:00:57.070910 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 17 18:00:57.070921 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 17 18:00:57.070929 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 17 18:00:57.070940 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Mar 17 18:00:57.070948 kernel: i8042: PNP: No PS/2 controller found.
Mar 17 18:00:57.071100 kernel: rtc_cmos 00:02: registered as rtc0
Mar 17 18:00:57.071212 kernel: rtc_cmos 00:02: setting system clock to 2025-03-17T18:00:56 UTC (1742234456)
Mar 17 18:00:57.071311 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
Mar 17 18:00:57.071325 kernel: intel_pstate: CPU model not supported
Mar 17 18:00:57.071336 kernel: efifb: probing for efifb
Mar 17 18:00:57.071347 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 17 18:00:57.071357 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 17 18:00:57.071367 kernel: efifb: scrolling: redraw
Mar 17 18:00:57.071378 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 17 18:00:57.071388 kernel: Console: switching to colour frame buffer device 128x48
Mar 17 18:00:57.071397 kernel: fb0: EFI VGA frame buffer device
Mar 17 18:00:57.071410 kernel: pstore: Using crash dump compression: deflate
Mar 17 18:00:57.071419 kernel: pstore: Registered efi_pstore as persistent store backend
Mar 17 18:00:57.071429 kernel: NET: Registered PF_INET6 protocol family
Mar 17 18:00:57.071437 kernel: Segment Routing with IPv6
Mar 17 18:00:57.071448 kernel: In-situ OAM (IOAM) with IPv6
Mar 17 18:00:57.071456 kernel: NET: Registered PF_PACKET protocol family
Mar 17 18:00:57.071468 kernel: Key type dns_resolver registered
Mar 17 18:00:57.071475 kernel: IPI shorthand broadcast: enabled
Mar 17 18:00:57.071487 kernel: sched_clock: Marking stable (781003000, 52749000)->(1033403000, -199651000)
Mar 17 18:00:57.071498 kernel: registered taskstats version 1
Mar 17 18:00:57.071509 kernel: Loading compiled-in X.509 certificates
Mar 17 18:00:57.071518 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 2d438fc13e28f87f3f580874887bade2e2b0c7dd'
Mar 17 18:00:57.071528 kernel: Key type .fscrypt registered
Mar 17 18:00:57.071536 kernel: Key type fscrypt-provisioning registered
Mar 17 18:00:57.071547 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 17 18:00:57.071555 kernel: ima: Allocated hash algorithm: sha1
Mar 17 18:00:57.071567 kernel: ima: No architecture policies found
Mar 17 18:00:57.071575 kernel: clk: Disabling unused clocks
Mar 17 18:00:57.071588 kernel: Freeing unused kernel image (initmem) memory: 43476K
Mar 17 18:00:57.071596 kernel: Write protecting the kernel read-only data: 38912k
Mar 17 18:00:57.071607 kernel: Freeing unused kernel image (rodata/data gap) memory: 1716K
Mar 17 18:00:57.071616 kernel: Run /init as init process
Mar 17 18:00:57.071626 kernel: with arguments:
Mar 17 18:00:57.071634 kernel: /init
Mar 17 18:00:57.071645 kernel: with environment:
Mar 17 18:00:57.071653 kernel: HOME=/
Mar 17 18:00:57.071664 kernel: TERM=linux
Mar 17 18:00:57.071675 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 17 18:00:57.071686 systemd[1]: Successfully made /usr/ read-only.
Mar 17 18:00:57.071700 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 17 18:00:57.071713 systemd[1]: Detected virtualization microsoft.
Mar 17 18:00:57.071724 systemd[1]: Detected architecture x86-64.
Mar 17 18:00:57.071735 systemd[1]: Running in initrd.
Mar 17 18:00:57.071746 systemd[1]: No hostname configured, using default hostname.
Mar 17 18:00:57.071759 systemd[1]: Hostname set to .
Mar 17 18:00:57.071769 systemd[1]: Initializing machine ID from random generator.
Mar 17 18:00:57.071779 systemd[1]: Queued start job for default target initrd.target.
Mar 17 18:00:57.071790 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 18:00:57.071800 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 18:00:57.071811 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 17 18:00:57.071822 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 18:00:57.071831 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 17 18:00:57.071845 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 17 18:00:57.071856 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 17 18:00:57.071867 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 17 18:00:57.071877 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 18:00:57.071887 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 18:00:57.071899 systemd[1]: Reached target paths.target - Path Units.
Mar 17 18:00:57.071908 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 18:00:57.071922 systemd[1]: Reached target swap.target - Swaps.
Mar 17 18:00:57.071931 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 18:00:57.071942 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 18:00:57.071951 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 18:00:57.071963 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 17 18:00:57.071972 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 17 18:00:57.071983 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 18:00:57.071993 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 18:00:57.072003 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 18:00:57.072018 systemd[1]: Reached target sockets.target - Socket Units.
Mar 17 18:00:57.072026 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 17 18:00:57.072038 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 18:00:57.072051 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 17 18:00:57.072061 systemd[1]: Starting systemd-fsck-usr.service...
Mar 17 18:00:57.072071 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 18:00:57.072082 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 18:00:57.072093 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 18:00:57.072125 systemd-journald[177]: Collecting audit messages is disabled.
Mar 17 18:00:57.072148 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 17 18:00:57.072160 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 18:00:57.072173 systemd-journald[177]: Journal started
Mar 17 18:00:57.074800 systemd-journald[177]: Runtime Journal (/run/log/journal/affaacbb03914fe59ee951b81e3f0c1a) is 8M, max 158.8M, 150.8M free.
Mar 17 18:00:57.079263 systemd[1]: Finished systemd-fsck-usr.service.
Mar 17 18:00:57.081229 systemd-modules-load[179]: Inserted module 'overlay'
Mar 17 18:00:57.087199 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 18:00:57.095614 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 18:00:57.108379 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 18:00:57.116329 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 17 18:00:57.126306 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 18:00:57.130903 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 17 18:00:57.138986 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 17 18:00:57.148333 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 18:00:57.148651 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 18:00:57.153063 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 18:00:57.161698 systemd-modules-load[179]: Inserted module 'br_netfilter'
Mar 17 18:00:57.163715 kernel: Bridge firewalling registered
Mar 17 18:00:57.170441 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 17 18:00:57.175401 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 18:00:57.182865 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 18:00:57.192661 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 18:00:57.198291 dracut-cmdline[209]: dracut-dracut-053
Mar 17 18:00:57.198291 dracut-cmdline[209]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=2a4a0f64c0160ed10b339be09fdc9d7e265b13f78aefc87616e79bf13c00bb1c
Mar 17 18:00:57.220895 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 18:00:57.228371 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 18:00:57.270660 systemd-resolved[248]: Positive Trust Anchors:
Mar 17 18:00:57.273175 systemd-resolved[248]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 18:00:57.277107 systemd-resolved[248]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 18:00:57.295156 systemd-resolved[248]: Defaulting to hostname 'linux'.
Mar 17 18:00:57.300313 kernel: SCSI subsystem initialized
Mar 17 18:00:57.296230 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 18:00:57.305309 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 18:00:57.314262 kernel: Loading iSCSI transport class v2.0-870.
Mar 17 18:00:57.325206 kernel: iscsi: registered transport (tcp)
Mar 17 18:00:57.346656 kernel: iscsi: registered transport (qla4xxx)
Mar 17 18:00:57.346722 kernel: QLogic iSCSI HBA Driver
Mar 17 18:00:57.381683 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 17 18:00:57.387447 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 17 18:00:57.414878 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 17 18:00:57.414942 kernel: device-mapper: uevent: version 1.0.3
Mar 17 18:00:57.417961 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 17 18:00:57.457224 kernel: raid6: avx512x4 gen() 18272 MB/s
Mar 17 18:00:57.476203 kernel: raid6: avx512x2 gen() 18262 MB/s
Mar 17 18:00:57.494215 kernel: raid6: avx512x1 gen() 18269 MB/s
Mar 17 18:00:57.512198 kernel: raid6: avx2x4 gen() 18240 MB/s
Mar 17 18:00:57.534211 kernel: raid6: avx2x2 gen() 18235 MB/s
Mar 17 18:00:57.553770 kernel: raid6: avx2x1 gen() 13438 MB/s
Mar 17 18:00:57.553823 kernel: raid6: using algorithm avx512x4 gen() 18272 MB/s
Mar 17 18:00:57.574803 kernel: raid6: .... xor() 7735 MB/s, rmw enabled
Mar 17 18:00:57.574850 kernel: raid6: using avx512x2 recovery algorithm
Mar 17 18:00:57.597207 kernel: xor: automatically using best checksumming function avx
Mar 17 18:00:57.737212 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 17 18:00:57.746974 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 18:00:57.760598 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 18:00:57.778057 systemd-udevd[399]: Using default interface naming scheme 'v255'.
Mar 17 18:00:57.785173 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 18:00:57.798297 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 17 18:00:57.811120 dracut-pre-trigger[401]: rd.md=0: removing MD RAID activation
Mar 17 18:00:57.835753 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 18:00:57.842449 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 18:00:57.886836 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 18:00:57.900384 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 17 18:00:57.923369 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 17 18:00:57.930094 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 18:00:57.933179 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 18:00:57.939437 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 18:00:57.953401 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 17 18:00:57.966212 kernel: cryptd: max_cpu_qlen set to 1000
Mar 17 18:00:57.993216 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 17 18:00:57.993285 kernel: AES CTR mode by8 optimization enabled
Mar 17 18:00:57.994332 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 18:00:57.999839 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 18:00:58.000768 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 18:00:58.012293 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 18:00:58.019384 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 18:00:58.019576 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 18:00:58.022536 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 18:00:58.034435 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 18:00:58.047201 kernel: hv_vmbus: Vmbus version:5.2
Mar 17 18:00:58.058907 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 18:00:58.061641 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 18:00:58.069763 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 17 18:00:58.079258 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 17 18:00:58.088967 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 17 18:00:58.089003 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Mar 17 18:00:58.091728 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 18:00:58.105059 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/VMBUS:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Mar 17 18:00:58.109198 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 17 18:00:58.113589 kernel: hv_vmbus: registering driver hid_hyperv
Mar 17 18:00:58.117100 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Mar 17 18:00:58.117200 kernel: PTP clock support registered
Mar 17 18:00:58.118321 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 18:00:58.129372 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 17 18:00:58.134892 kernel: hv_utils: Registering HyperV Utility Driver
Mar 17 18:00:58.134942 kernel: hv_vmbus: registering driver hv_utils
Mar 17 18:00:58.137032 kernel: hv_utils: Heartbeat IC version 3.0
Mar 17 18:00:58.141235 kernel: hv_utils: TimeSync IC version 4.0
Mar 17 18:00:58.141277 kernel: hv_utils: Shutdown IC version 3.2
Mar 17 18:00:58.141519 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 18:00:58.715979 systemd-resolved[248]: Clock change detected. Flushing caches.
Mar 17 18:00:58.734979 kernel: hv_vmbus: registering driver hv_storvsc
Mar 17 18:00:58.738702 kernel: scsi host1: storvsc_host_t
Mar 17 18:00:58.738783 kernel: scsi host0: storvsc_host_t
Mar 17 18:00:58.747231 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 17 18:00:58.747311 kernel: hv_vmbus: registering driver hv_netvsc
Mar 17 18:00:58.756890 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 18:00:58.760641 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0
Mar 17 18:00:58.779837 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 17 18:00:58.782193 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 17 18:00:58.782235 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 17 18:00:58.794658 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 17 18:00:58.812225 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 17 18:00:58.812430 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 17 18:00:58.812606 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 17 18:00:58.812793 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 17 18:00:58.812970 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 18:00:58.812992 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 17 18:00:58.942265 kernel: hv_netvsc 7c1e5279-177a-7c1e-5279-177a7c1e5279 eth0: VF slot 1 added
Mar 17 18:00:58.952720 kernel: hv_vmbus: registering driver hv_pci
Mar 17 18:00:58.952762 kernel: hv_pci 553771f8-c8c4-4c8f-885e-6246b1b275c2: PCI VMBus probing: Using version 0x10004
Mar 17 18:00:58.996774 kernel: hv_pci 553771f8-c8c4-4c8f-885e-6246b1b275c2: PCI host bridge to bus c8c4:00
Mar 17 18:00:58.997268 kernel: pci_bus c8c4:00: root bus resource [mem 0xfe0000000-0xfe00fffff window]
Mar 17 18:00:58.997468 kernel: pci_bus c8c4:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 17 18:00:58.997638 kernel: pci c8c4:00:02.0: [15b3:1016] type 00 class 0x020000
Mar 17 18:00:58.997831 kernel: pci c8c4:00:02.0: reg 0x10: [mem 0xfe0000000-0xfe00fffff 64bit pref]
Mar 17 18:00:58.998025 kernel: pci c8c4:00:02.0: enabling Extended Tags
Mar 17 18:00:58.998200 kernel: pci c8c4:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at c8c4:00:02.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link)
Mar 17 18:00:58.998389 kernel: pci_bus c8c4:00: busn_res: [bus 00-ff] end is updated to 00
Mar 17 18:00:58.998538 kernel: pci c8c4:00:02.0: BAR 0: assigned [mem 0xfe0000000-0xfe00fffff 64bit pref]
Mar 17 18:00:59.157571 kernel: mlx5_core c8c4:00:02.0: enabling device (0000 -> 0002)
Mar 17 18:00:59.384726 kernel: mlx5_core c8c4:00:02.0: firmware version: 14.30.5000
Mar 17 18:00:59.384956 kernel: hv_netvsc 7c1e5279-177a-7c1e-5279-177a7c1e5279 eth0: VF registering: eth1
Mar 17 18:00:59.385399 kernel: mlx5_core c8c4:00:02.0 eth1: joined to eth0
Mar 17 18:00:59.385612 kernel: mlx5_core c8c4:00:02.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic)
Mar 17 18:00:59.392230 kernel: mlx5_core c8c4:00:02.0 enP51396s1: renamed from eth1
Mar 17 18:00:59.531688 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 17 18:00:59.624233 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (457)
Mar 17 18:00:59.641231 kernel: BTRFS: device fsid 16b3954e-2e86-4c7f-a948-d3d3817b1bdc devid 1 transid 42 /dev/sda3 scanned by (udev-worker) (465)
Mar 17 18:00:59.651669 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 17 18:00:59.666840 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 17 18:00:59.669831 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 17 18:00:59.689606 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 17 18:00:59.724891 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 17 18:01:00.712320 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 17 18:01:00.713993 disk-uuid[604]: The operation has completed successfully.
Mar 17 18:01:00.786357 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 17 18:01:00.786468 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 17 18:01:00.842411 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 17 18:01:00.850253 sh[693]: Success
Mar 17 18:01:00.886875 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Mar 17 18:01:01.174052 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 17 18:01:01.191282 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 17 18:01:01.195760 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 17 18:01:01.211222 kernel: BTRFS info (device dm-0): first mount of filesystem 16b3954e-2e86-4c7f-a948-d3d3817b1bdc
Mar 17 18:01:01.211257 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 17 18:01:01.215528 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 17 18:01:01.218196 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 17 18:01:01.220439 kernel: BTRFS info (device dm-0): using free space tree
Mar 17 18:01:01.711954 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 17 18:01:01.715278 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 17 18:01:01.722410 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 17 18:01:01.726344 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 17 18:01:01.751083 kernel: BTRFS info (device sda6): first mount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f
Mar 17 18:01:01.751120 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 17 18:01:01.751136 kernel: BTRFS info (device sda6): using free space tree
Mar 17 18:01:01.776606 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 18:01:01.789260 kernel: BTRFS info (device sda6): last unmount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f
Mar 17 18:01:01.788794 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 17 18:01:01.798996 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 17 18:01:01.809392 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 17 18:01:01.822733 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 18:01:01.836352 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 18:01:01.859978 systemd-networkd[878]: lo: Link UP
Mar 17 18:01:01.859987 systemd-networkd[878]: lo: Gained carrier
Mar 17 18:01:01.862263 systemd-networkd[878]: Enumeration completed
Mar 17 18:01:01.862471 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 18:01:01.864534 systemd-networkd[878]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 18:01:01.864539 systemd-networkd[878]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 18:01:01.866412 systemd[1]: Reached target network.target - Network.
Mar 17 18:01:01.933226 kernel: mlx5_core c8c4:00:02.0 enP51396s1: Link up
Mar 17 18:01:01.963278 kernel: hv_netvsc 7c1e5279-177a-7c1e-5279-177a7c1e5279 eth0: Data path switched to VF: enP51396s1
Mar 17 18:01:01.963755 systemd-networkd[878]: enP51396s1: Link UP
Mar 17 18:01:01.963888 systemd-networkd[878]: eth0: Link UP
Mar 17 18:01:01.964050 systemd-networkd[878]: eth0: Gained carrier
Mar 17 18:01:01.964064 systemd-networkd[878]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 18:01:01.968408 systemd-networkd[878]: enP51396s1: Gained carrier
Mar 17 18:01:01.985277 systemd-networkd[878]: eth0: DHCPv4 address 10.200.4.11/24, gateway 10.200.4.1 acquired from 168.63.129.16
Mar 17 18:01:02.975876 ignition[867]: Ignition 2.20.0
Mar 17 18:01:02.975889 ignition[867]: Stage: fetch-offline
Mar 17 18:01:02.975929 ignition[867]: no configs at "/usr/lib/ignition/base.d"
Mar 17 18:01:02.975939 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 18:01:02.976060 ignition[867]: parsed url from cmdline: ""
Mar 17 18:01:02.976065 ignition[867]: no config URL provided
Mar 17 18:01:02.976071 ignition[867]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 18:01:02.976082 ignition[867]: no config at "/usr/lib/ignition/user.ign"
Mar 17 18:01:02.976089 ignition[867]: failed to fetch config: resource requires networking
Mar 17 18:01:02.976390 ignition[867]: Ignition finished successfully
Mar 17 18:01:02.995251 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 18:01:03.006390 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 17 18:01:03.022487 ignition[888]: Ignition 2.20.0
Mar 17 18:01:03.022499 ignition[888]: Stage: fetch
Mar 17 18:01:03.022702 ignition[888]: no configs at "/usr/lib/ignition/base.d"
Mar 17 18:01:03.022716 ignition[888]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 18:01:03.022819 ignition[888]: parsed url from cmdline: ""
Mar 17 18:01:03.022822 ignition[888]: no config URL provided
Mar 17 18:01:03.022828 ignition[888]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 18:01:03.022835 ignition[888]: no config at "/usr/lib/ignition/user.ign"
Mar 17 18:01:03.022860 ignition[888]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 17 18:01:03.109376 ignition[888]: GET result: OK
Mar 17 18:01:03.109464 ignition[888]: config has been read from IMDS userdata
Mar 17 18:01:03.109480 ignition[888]: parsing config with SHA512: bd531d7be790d6df1daf4664e5852101302d72790588aa530158203e2a7c89115a75c643285f14d33911d96ea1307b26ad21bda30afa4919e6aede6a46c092d1
Mar 17 18:01:03.112931 unknown[888]: fetched base config from "system"
Mar 17 18:01:03.113220 ignition[888]: fetch: fetch complete
Mar 17 18:01:03.112938 unknown[888]: fetched base config from "system"
Mar 17 18:01:03.113226 ignition[888]: fetch: fetch passed
Mar 17 18:01:03.112943 unknown[888]: fetched user config from "azure"
Mar 17 18:01:03.113267 ignition[888]: Ignition finished successfully
Mar 17 18:01:03.124876 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 17 18:01:03.133345 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 17 18:01:03.150991 ignition[894]: Ignition 2.20.0
Mar 17 18:01:03.151003 ignition[894]: Stage: kargs
Mar 17 18:01:03.153017 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 17 18:01:03.151239 ignition[894]: no configs at "/usr/lib/ignition/base.d"
Mar 17 18:01:03.151255 ignition[894]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 18:01:03.151909 ignition[894]: kargs: kargs passed
Mar 17 18:01:03.151951 ignition[894]: Ignition finished successfully
Mar 17 18:01:03.171411 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 17 18:01:03.184387 ignition[900]: Ignition 2.20.0
Mar 17 18:01:03.184398 ignition[900]: Stage: disks
Mar 17 18:01:03.186054 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 17 18:01:03.184601 ignition[900]: no configs at "/usr/lib/ignition/base.d"
Mar 17 18:01:03.188877 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 17 18:01:03.184614 ignition[900]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 17 18:01:03.194291 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 17 18:01:03.185294 ignition[900]: disks: disks passed
Mar 17 18:01:03.185334 ignition[900]: Ignition finished successfully
Mar 17 18:01:03.212396 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 18:01:03.214993 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 18:01:03.219668 systemd[1]: Reached target basic.target - Basic System.
Mar 17 18:01:03.228369 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 17 18:01:03.296258 systemd-fsck[908]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 17 18:01:03.300436 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 17 18:01:03.314701 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 17 18:01:03.406222 kernel: EXT4-fs (sda9): mounted filesystem 21764504-a65e-45eb-84e1-376b55b62aba r/w with ordered data mode. Quota mode: none.
Mar 17 18:01:03.406809 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 17 18:01:03.411089 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 17 18:01:03.462346 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 18:01:03.467391 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 17 18:01:03.476285 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (919)
Mar 17 18:01:03.482668 kernel: BTRFS info (device sda6): first mount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f
Mar 17 18:01:03.482719 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 17 18:01:03.486840 kernel: BTRFS info (device sda6): using free space tree
Mar 17 18:01:03.486109 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 17 18:01:03.492998 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 17 18:01:03.493370 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 17 18:01:03.493405 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 18:01:03.502815 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 18:01:03.507587 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 17 18:01:03.514360 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 17 18:01:03.922462 systemd-networkd[878]: enP51396s1: Gained IPv6LL
Mar 17 18:01:03.986344 systemd-networkd[878]: eth0: Gained IPv6LL
Mar 17 18:01:04.357520 coreos-metadata[921]: Mar 17 18:01:04.357 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 17 18:01:04.362701 coreos-metadata[921]: Mar 17 18:01:04.362 INFO Fetch successful
Mar 17 18:01:04.365548 coreos-metadata[921]: Mar 17 18:01:04.362 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 17 18:01:04.371250 coreos-metadata[921]: Mar 17 18:01:04.371 INFO Fetch successful
Mar 17 18:01:04.392073 coreos-metadata[921]: Mar 17 18:01:04.392 INFO wrote hostname ci-4230.1.0-a-d9de89fbd8 to /sysroot/etc/hostname
Mar 17 18:01:04.394443 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 17 18:01:04.431574 initrd-setup-root[949]: cut: /sysroot/etc/passwd: No such file or directory
Mar 17 18:01:04.486089 initrd-setup-root[956]: cut: /sysroot/etc/group: No such file or directory
Mar 17 18:01:04.491267 initrd-setup-root[963]: cut: /sysroot/etc/shadow: No such file or directory
Mar 17 18:01:04.495963 initrd-setup-root[970]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 17 18:01:05.710328 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 17 18:01:05.720310 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 17 18:01:05.727372 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 17 18:01:05.737548 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 17 18:01:05.743364 kernel: BTRFS info (device sda6): last unmount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 18:01:05.799232 ignition[1038]: INFO : Ignition 2.20.0 Mar 17 18:01:05.799232 ignition[1038]: INFO : Stage: mount Mar 17 18:01:05.799232 ignition[1038]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 18:01:05.799232 ignition[1038]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:01:05.809760 ignition[1038]: INFO : mount: mount passed Mar 17 18:01:05.809760 ignition[1038]: INFO : Ignition finished successfully Mar 17 18:01:05.800952 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 17 18:01:05.815544 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 17 18:01:05.822335 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 17 18:01:05.830860 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 18:01:05.847232 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (1050) Mar 17 18:01:05.853167 kernel: BTRFS info (device sda6): first mount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 18:01:05.853235 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 18:01:05.855443 kernel: BTRFS info (device sda6): using free space tree Mar 17 18:01:05.861252 kernel: BTRFS info (device sda6): auto enabling async discard Mar 17 18:01:05.862688 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 17 18:01:05.888023 ignition[1066]: INFO : Ignition 2.20.0 Mar 17 18:01:05.888023 ignition[1066]: INFO : Stage: files Mar 17 18:01:05.891842 ignition[1066]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 18:01:05.891842 ignition[1066]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:01:05.891842 ignition[1066]: DEBUG : files: compiled without relabeling support, skipping Mar 17 18:01:05.925838 ignition[1066]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 18:01:05.925838 ignition[1066]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 18:01:06.044003 ignition[1066]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 18:01:06.047518 ignition[1066]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 18:01:06.050772 unknown[1066]: wrote ssh authorized keys file for user: core Mar 17 18:01:06.053193 ignition[1066]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 18:01:06.093713 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Mar 17 18:01:06.098239 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1 Mar 17 18:01:06.633856 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK Mar 17 18:01:06.877485 ignition[1066]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Mar 17 18:01:06.882416 ignition[1066]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:01:06.882416 ignition[1066]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:01:06.882416 ignition[1066]: INFO : files: files passed Mar 17 18:01:06.882416 ignition[1066]: INFO : Ignition finished successfully Mar 17 18:01:06.879092 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 17 18:01:06.904407 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 17 18:01:06.910267 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 17 18:01:06.922638 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 18:01:06.922759 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 17 18:01:06.935579 initrd-setup-root-after-ignition[1096]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 18:01:06.939664 initrd-setup-root-after-ignition[1100]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 18:01:06.943008 initrd-setup-root-after-ignition[1096]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 17 18:01:06.946500 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 18:01:06.949653 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 17 18:01:06.961383 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 17 18:01:06.986131 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 18:01:06.986267 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 17 18:01:06.991490 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 17 18:01:06.996068 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 17 18:01:06.998303 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 17 18:01:06.999350 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 17 18:01:07.016398 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 17 18:01:07.030649 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 17 18:01:07.045733 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 17 18:01:07.048612 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Mar 17 18:01:07.056000 systemd[1]: Stopped target timers.target - Timer Units. Mar 17 18:01:07.058144 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 17 18:01:07.058305 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 17 18:01:07.063045 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 17 18:01:07.069107 systemd[1]: Stopped target basic.target - Basic System. Mar 17 18:01:07.077307 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 17 18:01:07.082247 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 18:01:07.087383 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 17 18:01:07.092522 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 17 18:01:07.095147 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 18:01:07.100275 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 17 18:01:07.106476 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 17 18:01:07.109026 systemd[1]: Stopped target swap.target - Swaps. Mar 17 18:01:07.113265 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 18:01:07.119048 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 17 18:01:07.122011 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 17 18:01:07.128711 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 18:01:07.131637 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 17 18:01:07.133931 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 18:01:07.136994 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 18:01:07.146890 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 17 18:01:07.147485 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 18:01:07.147604 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 18:01:07.147809 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 18:01:07.147903 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 17 18:01:07.148162 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 17 18:01:07.148275 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 17 18:01:07.170012 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 17 18:01:07.181601 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 17 18:01:07.183587 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 18:01:07.184390 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 18:01:07.190514 ignition[1120]: INFO : Ignition 2.20.0 Mar 17 18:01:07.190514 ignition[1120]: INFO : Stage: umount Mar 17 18:01:07.190514 ignition[1120]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 18:01:07.190514 ignition[1120]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:01:07.190514 ignition[1120]: INFO : umount: umount passed Mar 17 18:01:07.190514 ignition[1120]: INFO : Ignition finished successfully Mar 17 18:01:07.195066 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Mar 17 18:01:07.195242 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 18:01:07.206318 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 18:01:07.206427 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 17 18:01:07.216889 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 18:01:07.217019 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 17 18:01:07.219524 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 18:01:07.219600 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 17 18:01:07.233867 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 17 18:01:07.233923 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 17 18:01:07.236373 systemd[1]: Stopped target network.target - Network. Mar 17 18:01:07.236455 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 18:01:07.236503 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 18:01:07.236829 systemd[1]: Stopped target paths.target - Path Units. Mar 17 18:01:07.237173 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 18:01:07.244990 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 18:01:07.247641 systemd[1]: Stopped target slices.target - Slice Units. Mar 17 18:01:07.249640 systemd[1]: Stopped target sockets.target - Socket Units. Mar 17 18:01:07.251958 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 18:01:07.252007 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 18:01:07.280026 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 18:01:07.280084 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 18:01:07.282768 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 18:01:07.282842 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 17 18:01:07.285287 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 17 18:01:07.285335 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 17 18:01:07.289966 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 17 18:01:07.294426 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 17 18:01:07.310314 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 18:01:07.312897 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 18:01:07.314883 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 17 18:01:07.327570 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 18:01:07.327688 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 17 18:01:07.335863 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 17 18:01:07.336052 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 18:01:07.336131 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 17 18:01:07.342521 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 17 18:01:07.343303 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 18:01:07.343379 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 17 18:01:07.354332 systemd[1]: Stopping network-cleanup.service - Network Cleanup... 
Mar 17 18:01:07.359855 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 18:01:07.359912 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 18:01:07.362743 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 18:01:07.362796 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 17 18:01:07.369411 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 18:01:07.369458 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 17 18:01:07.374626 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 17 18:01:07.374674 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 18:01:07.379421 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 18:01:07.398046 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 17 18:01:07.398128 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 17 18:01:07.406509 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 18:01:07.406690 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 18:01:07.415342 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 18:01:07.415427 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 17 18:01:07.422260 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 18:01:07.422307 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 18:01:07.427138 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 18:01:07.429357 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 17 18:01:07.431872 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 18:01:07.431918 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 17 18:01:07.442049 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 18:01:07.442116 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 18:01:07.454276 kernel: hv_netvsc 7c1e5279-177a-7c1e-5279-177a7c1e5279 eth0: Data path switched from VF: enP51396s1 Mar 17 18:01:07.459353 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 17 18:01:07.461732 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 18:01:07.461789 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 18:01:07.464457 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 17 18:01:07.464503 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 18:01:07.492768 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 18:01:07.492846 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 18:01:07.512665 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 18:01:07.512733 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 18:01:07.521364 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. 
Mar 17 18:01:07.521438 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 17 18:01:07.521810 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 18:01:07.521899 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 17 18:01:07.526857 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 18:01:07.526947 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 17 18:01:07.875195 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 18:01:07.875342 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 17 18:01:07.879812 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 17 18:01:07.884070 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 18:01:07.884141 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 17 18:01:07.898371 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 17 18:01:08.129716 systemd[1]: Switching root. Mar 17 18:01:08.220433 systemd-journald[177]: Journal stopped Mar 17 18:01:13.256526 systemd-journald[177]: Received SIGTERM from PID 1 (systemd). Mar 17 18:01:13.256565 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 18:01:13.256582 kernel: SELinux: policy capability open_perms=1 Mar 17 18:01:13.256597 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 18:01:13.256610 kernel: SELinux: policy capability always_check_network=0 Mar 17 18:01:13.256624 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 18:01:13.256640 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 18:01:13.256657 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 18:01:13.256671 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 18:01:13.256686 kernel: audit: type=1403 audit(1742234469.133:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 17 18:01:13.256701 systemd[1]: Successfully loaded SELinux policy in 176.708ms. Mar 17 18:01:13.256718 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.125ms. Mar 17 18:01:13.256735 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 17 18:01:13.256751 systemd[1]: Detected virtualization microsoft. Mar 17 18:01:13.256771 systemd[1]: Detected architecture x86-64. Mar 17 18:01:13.256787 systemd[1]: Detected first boot. Mar 17 18:01:13.256804 systemd[1]: Hostname set to <ci-4230.1.0-a-d9de89fbd8>. Mar 17 18:01:13.256820 systemd[1]: Initializing machine ID from random generator. Mar 17 18:01:13.256836 zram_generator::config[1167]: No configuration found. Mar 17 18:01:13.256855 kernel: Guest personality initialized and is inactive Mar 17 18:01:13.256872 kernel: VMCI host device registered (name=vmci, major=10, minor=124) Mar 17 18:01:13.256886 kernel: Initialized host personality Mar 17 18:01:13.256901 kernel: NET: Registered PF_VSOCK protocol family Mar 17 18:01:13.256916 systemd[1]: Populated /etc with preset unit settings. Mar 17 18:01:13.256934 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 17 18:01:13.256950 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 17 18:01:13.256966 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 17 18:01:13.256984 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 17 18:01:13.257001 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 17 18:01:13.257018 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 17 18:01:13.257035 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 17 18:01:13.257051 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 17 18:01:13.257069 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 17 18:01:13.257085 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 17 18:01:13.257104 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 17 18:01:13.257121 systemd[1]: Created slice user.slice - User and Session Slice. Mar 17 18:01:13.257138 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 18:01:13.257155 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 18:01:13.257172 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 17 18:01:13.257189 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 17 18:01:13.257218 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 17 18:01:13.257248 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 18:01:13.257265 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 17 18:01:13.257301 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 18:01:13.257318 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 17 18:01:13.257335 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 17 18:01:13.257353 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 17 18:01:13.257370 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 17 18:01:13.257387 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 18:01:13.257405 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 18:01:13.257424 systemd[1]: Reached target slices.target - Slice Units. Mar 17 18:01:13.257441 systemd[1]: Reached target swap.target - Swaps. Mar 17 18:01:13.257458 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 17 18:01:13.257476 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 17 18:01:13.257493 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 17 18:01:13.257511 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 18:01:13.257531 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 18:01:13.257549 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 18:01:13.257567 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 17 18:01:13.257584 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... 
Mar 17 18:01:13.257602 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 17 18:01:13.257620 systemd[1]: Mounting media.mount - External Media Directory... Mar 17 18:01:13.257639 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:01:13.257659 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 17 18:01:13.257676 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 17 18:01:13.257694 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 17 18:01:13.257712 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 18:01:13.257730 systemd[1]: Reached target machines.target - Containers. Mar 17 18:01:13.257747 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 17 18:01:13.257765 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 18:01:13.257783 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 18:01:13.257803 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 17 18:01:13.257820 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 18:01:13.257838 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 17 18:01:13.257856 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 18:01:13.257873 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 17 18:01:13.257891 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 18:01:13.257908 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 18:01:13.257926 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 17 18:01:13.257946 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 17 18:01:13.257964 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 17 18:01:13.257981 systemd[1]: Stopped systemd-fsck-usr.service. Mar 17 18:01:13.257999 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 17 18:01:13.258018 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 18:01:13.258036 kernel: fuse: init (API version 7.39) Mar 17 18:01:13.258053 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 18:01:13.258070 kernel: loop: module loaded Mar 17 18:01:13.258089 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 17 18:01:13.258107 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 17 18:01:13.258124 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 17 18:01:13.258142 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 18:01:13.258159 systemd[1]: verity-setup.service: Deactivated successfully. 
Mar 17 18:01:13.258198 systemd-journald[1250]: Collecting audit messages is disabled. Mar 17 18:01:13.258244 systemd[1]: Stopped verity-setup.service. Mar 17 18:01:13.258263 systemd-journald[1250]: Journal started Mar 17 18:01:13.258297 systemd-journald[1250]: Runtime Journal (/run/log/journal/3c51d1378e3a42edab226311fab346fc) is 8M, max 158.8M, 150.8M free. Mar 17 18:01:12.630626 systemd[1]: Queued start job for default target multi-user.target. Mar 17 18:01:12.639328 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 17 18:01:12.639718 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 17 18:01:13.274773 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:01:13.274813 kernel: ACPI: bus type drm_connector registered Mar 17 18:01:13.287226 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 18:01:13.290039 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 17 18:01:13.292588 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 17 18:01:13.295462 systemd[1]: Mounted media.mount - External Media Directory. Mar 17 18:01:13.297949 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 17 18:01:13.300681 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 17 18:01:13.303320 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 17 18:01:13.305874 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 18:01:13.309099 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 18:01:13.309353 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 17 18:01:13.312198 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:01:13.312410 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 18:01:13.315529 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 17 18:01:13.318569 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:01:13.318779 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 17 18:01:13.321500 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:01:13.321708 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 18:01:13.324863 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 18:01:13.325059 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 17 18:01:13.327904 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:01:13.328083 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 18:01:13.330790 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 18:01:13.333607 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 17 18:01:13.336591 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 17 18:01:13.349109 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 17 18:01:13.358868 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 17 18:01:13.372283 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Mar 17 18:01:13.374945 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 18:01:13.374992 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 18:01:13.379091 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 17 18:01:13.389321 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 17 18:01:13.398381 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 17 18:01:13.401994 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 18:01:13.417710 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 17 18:01:13.425321 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 17 18:01:13.428100 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:01:13.429683 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 17 18:01:13.432282 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 17 18:01:13.433371 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 18:01:13.439995 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 17 18:01:13.447334 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 18:01:13.452685 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 17 18:01:13.457670 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 18:01:13.460790 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 17 18:01:13.470426 systemd-journald[1250]: Time spent on flushing to /var/log/journal/3c51d1378e3a42edab226311fab346fc is 35.561ms for 956 entries. Mar 17 18:01:13.470426 systemd-journald[1250]: System Journal (/var/log/journal/3c51d1378e3a42edab226311fab346fc) is 8M, max 2.6G, 2.6G free. Mar 17 18:01:13.518661 systemd-journald[1250]: Received client request to flush runtime journal. Mar 17 18:01:13.518717 kernel: loop0: detected capacity change from 0 to 218376 Mar 17 18:01:13.464866 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 17 18:01:13.467862 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 17 18:01:13.475729 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 17 18:01:13.483738 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 17 18:01:13.494087 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 17 18:01:13.510119 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 17 18:01:13.521458 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 17 18:01:13.532196 udevadm[1319]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 17 18:01:13.550081 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Mar 17 18:01:13.564239 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 18:01:13.572765 systemd-tmpfiles[1310]: ACLs are not supported, ignoring. Mar 17 18:01:13.572787 systemd-tmpfiles[1310]: ACLs are not supported, ignoring. Mar 17 18:01:13.577943 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 18:01:13.587398 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 17 18:01:13.602145 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 17 18:01:13.607311 kernel: loop1: detected capacity change from 0 to 28272 Mar 17 18:01:13.641097 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 18:01:14.047234 kernel: loop2: detected capacity change from 0 to 138176 Mar 17 18:01:14.066262 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 17 18:01:14.073459 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 18:01:14.090159 systemd-tmpfiles[1332]: ACLs are not supported, ignoring. Mar 17 18:01:14.090182 systemd-tmpfiles[1332]: ACLs are not supported, ignoring. Mar 17 18:01:14.095707 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 18:01:14.754573 kernel: loop3: detected capacity change from 0 to 147912 Mar 17 18:01:15.463024 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 17 18:01:15.471510 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 18:01:15.498917 systemd-udevd[1337]: Using default interface naming scheme 'v255'. Mar 17 18:01:15.852235 kernel: loop4: detected capacity change from 0 to 218376 Mar 17 18:01:15.855311 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 18:01:15.864426 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 18:01:15.871282 kernel: loop5: detected capacity change from 0 to 28272 Mar 17 18:01:15.893848 kernel: loop6: detected capacity change from 0 to 138176 Mar 17 18:01:15.941254 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 17 18:01:15.950448 kernel: loop7: detected capacity change from 0 to 147912 Mar 17 18:01:15.977152 (sd-merge)[1339]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. Mar 17 18:01:15.978190 (sd-merge)[1339]: Merged extensions into '/usr'. Mar 17 18:01:16.023371 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 17 18:01:16.033639 systemd[1]: Reload requested from client PID 1308 ('systemd-sysext') (unit systemd-sysext.service)... Mar 17 18:01:16.033659 systemd[1]: Reloading... Mar 17 18:01:16.098238 kernel: hv_vmbus: registering driver hv_balloon Mar 17 18:01:16.120285 kernel: hv_vmbus: registering driver hyperv_fb Mar 17 18:01:16.135231 kernel: mousedev: PS/2 mouse device common for all mice Mar 17 18:01:16.158322 zram_generator::config[1403]: No configuration found. 
Mar 17 18:01:16.164994 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Mar 17 18:01:16.165071 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Mar 17 18:01:16.170225 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Mar 17 18:01:16.177568 kernel: Console: switching to colour dummy device 80x25 Mar 17 18:01:16.188102 kernel: Console: switching to colour frame buffer device 128x48 Mar 17 18:01:16.415261 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 42 scanned by (udev-worker) (1359) Mar 17 18:01:16.493634 systemd-networkd[1344]: lo: Link UP Mar 17 18:01:16.494279 systemd-networkd[1344]: lo: Gained carrier Mar 17 18:01:16.500874 systemd-networkd[1344]: Enumeration completed Mar 17 18:01:16.501375 systemd-networkd[1344]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 18:01:16.501446 systemd-networkd[1344]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:01:16.632151 kernel: mlx5_core c8c4:00:02.0 enP51396s1: Link up Mar 17 18:01:16.642364 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:01:16.660891 kernel: hv_netvsc 7c1e5279-177a-7c1e-5279-177a7c1e5279 eth0: Data path switched to VF: enP51396s1 Mar 17 18:01:16.663032 systemd-networkd[1344]: enP51396s1: Link UP Mar 17 18:01:16.663278 systemd-networkd[1344]: eth0: Link UP Mar 17 18:01:16.663363 systemd-networkd[1344]: eth0: Gained carrier Mar 17 18:01:16.663436 systemd-networkd[1344]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 18:01:16.671761 systemd-networkd[1344]: enP51396s1: Gained carrier Mar 17 18:01:16.710262 systemd-networkd[1344]: eth0: DHCPv4 address 10.200.4.11/24, gateway 10.200.4.1 acquired from 168.63.129.16 Mar 17 18:01:16.748223 kernel: kvm_intel: Using Hyper-V Enlightened VMCS Mar 17 18:01:16.808368 systemd[1]: Reloading finished in 774 ms. Mar 17 18:01:16.834274 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 17 18:01:16.837351 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 18:01:16.840097 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 17 18:01:16.884653 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM. Mar 17 18:01:16.900490 systemd[1]: Starting ensure-sysext.service... Mar 17 18:01:16.907300 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 17 18:01:16.915388 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 17 18:01:16.920682 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 17 18:01:16.926516 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 18:01:16.932591 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 18:01:16.963172 systemd-tmpfiles[1535]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Mar 17 18:01:16.963591 systemd-tmpfiles[1535]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 17 18:01:16.970426 systemd-tmpfiles[1535]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 18:01:16.971137 systemd-tmpfiles[1535]: ACLs are not supported, ignoring. Mar 17 18:01:16.971337 systemd-tmpfiles[1535]: ACLs are not supported, ignoring. Mar 17 18:01:16.975294 systemd[1]: Reload requested from client PID 1531 ('systemctl') (unit ensure-sysext.service)... Mar 17 18:01:16.975316 systemd[1]: Reloading... Mar 17 18:01:16.981259 systemd-tmpfiles[1535]: Detected autofs mount point /boot during canonicalization of boot. Mar 17 18:01:16.981273 systemd-tmpfiles[1535]: Skipping /boot Mar 17 18:01:16.998156 systemd-tmpfiles[1535]: Detected autofs mount point /boot during canonicalization of boot. Mar 17 18:01:16.998261 systemd-tmpfiles[1535]: Skipping /boot Mar 17 18:01:17.072235 zram_generator::config[1571]: No configuration found. Mar 17 18:01:17.212407 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:01:17.336120 systemd[1]: Reloading finished in 360 ms. Mar 17 18:01:17.348151 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 17 18:01:17.365174 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 17 18:01:17.368669 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 17 18:01:17.372091 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 18:01:17.375569 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 18:01:17.393485 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 17 18:01:17.428479 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 17 18:01:17.433046 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 17 18:01:17.437986 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 17 18:01:17.446561 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 18:01:17.451829 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 17 18:01:17.459970 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:01:17.460577 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 18:01:17.465496 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 18:01:17.469464 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 18:01:17.483455 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 18:01:17.486011 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 17 18:01:17.486164 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 17 18:01:17.486324 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:01:17.487969 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:01:17.488668 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 18:01:17.492480 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:01:17.493154 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 18:01:17.497909 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:01:17.498099 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 18:01:17.511617 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 17 18:01:17.527919 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:01:17.528794 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 18:01:17.536740 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 18:01:17.537862 lvm[1643]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:01:17.551444 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 17 18:01:17.562663 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 18:01:17.568904 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 18:01:17.571467 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 18:01:17.571524 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 17 18:01:17.571603 systemd[1]: Reached target time-set.target - System Time Set. Mar 17 18:01:17.575487 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:01:17.576251 systemd[1]: Finished ensure-sysext.service. Mar 17 18:01:17.578951 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:01:17.579171 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 18:01:17.585907 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:01:17.586123 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 17 18:01:17.601127 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:01:17.601440 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 18:01:17.604546 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:01:17.604691 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Mar 17 18:01:17.607948 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:01:17.608062 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 17 18:01:17.638970 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 17 18:01:17.642705 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 18:01:17.650389 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 17 18:01:17.657493 lvm[1679]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:01:17.686073 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 17 18:01:17.703395 augenrules[1685]: No rules Mar 17 18:01:17.704719 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 18:01:17.704975 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 17 18:01:17.716068 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 17 18:01:17.733638 systemd-resolved[1645]: Positive Trust Anchors: Mar 17 18:01:17.733654 systemd-resolved[1645]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:01:17.733703 systemd-resolved[1645]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 18:01:17.737851 systemd-resolved[1645]: Using system hostname 'ci-4230.1.0-a-d9de89fbd8'. Mar 17 18:01:17.739570 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 18:01:17.742515 systemd[1]: Reached target network.target - Network. Mar 17 18:01:17.744945 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 18:01:17.810530 systemd-networkd[1344]: eth0: Gained IPv6LL Mar 17 18:01:17.813932 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 17 18:01:17.817572 systemd[1]: Reached target network-online.target - Network is Online. Mar 17 18:01:18.706459 systemd-networkd[1344]: enP51396s1: Gained IPv6LL Mar 17 18:01:18.733726 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 17 18:01:18.737110 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:01:22.061505 ldconfig[1303]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 18:01:22.077432 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 17 18:01:22.085484 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 17 18:01:22.114190 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 17 18:01:22.117478 systemd[1]: Reached target sysinit.target - System Initialization. 
Mar 17 18:01:22.120319 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 17 18:01:22.123291 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 17 18:01:22.126517 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 17 18:01:22.129150 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 17 18:01:22.132027 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 17 18:01:22.134908 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 18:01:22.134968 systemd[1]: Reached target paths.target - Path Units. Mar 17 18:01:22.137283 systemd[1]: Reached target timers.target - Timer Units. Mar 17 18:01:22.140596 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 17 18:01:22.144253 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 17 18:01:22.148960 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 17 18:01:22.152177 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 17 18:01:22.154985 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 17 18:01:22.169999 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 17 18:01:22.172854 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 17 18:01:22.176192 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 17 18:01:22.178713 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 18:01:22.180813 systemd[1]: Reached target basic.target - Basic System. Mar 17 18:01:22.182982 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 17 18:01:22.183023 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 17 18:01:22.191297 systemd[1]: Starting chronyd.service - NTP client/server... Mar 17 18:01:22.195333 systemd[1]: Starting containerd.service - containerd container runtime... Mar 17 18:01:22.206381 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 17 18:01:22.210383 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 17 18:01:22.215400 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 17 18:01:22.223379 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 17 18:01:22.225911 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 17 18:01:22.225978 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy). Mar 17 18:01:22.227091 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon. Mar 17 18:01:22.229812 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss). Mar 17 18:01:22.232919 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 17 18:01:22.249529 jq[1703]: false Mar 17 18:01:22.257487 KVP[1708]: KVP starting; pid is:1708 Mar 17 18:01:22.250115 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 17 18:01:22.255380 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 17 18:01:22.268224 kernel: hv_utils: KVP IC version 4.0 Mar 17 18:01:22.265236 KVP[1708]: KVP LIC Version: 3.1 Mar 17 18:01:22.268392 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 17 18:01:22.281290 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 17 18:01:22.286770 (chronyd)[1699]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS Mar 17 18:01:22.289421 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 17 18:01:22.296955 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 18:01:22.297539 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 17 18:01:22.306132 systemd[1]: Starting update-engine.service - Update Engine... Mar 17 18:01:22.310936 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 17 18:01:22.318835 chronyd[1722]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG) Mar 17 18:01:22.323681 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 18:01:22.323960 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 17 18:01:22.326694 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 18:01:22.326960 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 17 18:01:22.329201 jq[1720]: true Mar 17 18:01:22.361281 jq[1725]: true Mar 17 18:01:22.371825 extend-filesystems[1705]: Found loop4 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found loop5 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found loop6 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found loop7 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda1 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda2 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda3 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found usr Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda4 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda6 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda7 Mar 17 18:01:22.374368 extend-filesystems[1705]: Found sda9 Mar 17 18:01:22.374368 extend-filesystems[1705]: Checking size of /dev/sda9 Mar 17 18:01:22.375146 chronyd[1722]: Timezone right/UTC failed leap second check, ignoring Mar 17 18:01:22.390250 systemd[1]: Started chronyd.service - NTP client/server. Mar 17 18:01:22.375318 chronyd[1722]: Loaded seccomp filter (level 2) Mar 17 18:01:22.405636 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Mar 17 18:01:22.393290 dbus-daemon[1702]: [system] SELinux support is enabled Mar 17 18:01:22.418071 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 18:01:22.418118 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 17 18:01:22.421200 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 18:01:22.421230 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 17 18:01:22.456027 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 18:01:22.456318 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 17 18:01:22.461108 (ntainerd)[1738]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 17 18:01:22.492265 update_engine[1718]: I20250317 18:01:22.490989 1718 main.cc:92] Flatcar Update Engine starting Mar 17 18:01:22.496015 extend-filesystems[1705]: Old size kept for /dev/sda9 Mar 17 18:01:22.496015 extend-filesystems[1705]: Found sr0 Mar 17 18:01:22.500768 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 18:01:22.501059 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 17 18:01:22.510699 systemd[1]: Started update-engine.service - Update Engine. Mar 17 18:01:22.518506 update_engine[1718]: I20250317 18:01:22.518139 1718 update_check_scheduler.cc:74] Next update check in 7m33s Mar 17 18:01:22.521390 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 17 18:01:22.527330 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 17 18:01:22.531800 coreos-metadata[1701]: Mar 17 18:01:22.530 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 17 18:01:22.538077 systemd-logind[1716]: New seat seat0. Mar 17 18:01:22.540291 systemd-logind[1716]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 17 18:01:22.540472 systemd[1]: Started systemd-logind.service - User Login Management. Mar 17 18:01:22.559233 coreos-metadata[1701]: Mar 17 18:01:22.557 INFO Fetch successful Mar 17 18:01:22.559233 coreos-metadata[1701]: Mar 17 18:01:22.557 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1 Mar 17 18:01:22.585357 coreos-metadata[1701]: Mar 17 18:01:22.585 INFO Fetch successful Mar 17 18:01:22.587202 coreos-metadata[1701]: Mar 17 18:01:22.587 INFO Fetching http://168.63.129.16/machine/34c34add-747f-4751-9e2d-21936cd94a67/c511bdf3%2Dc0c4%2D483f%2D8472%2Dbe380353fe3d.%5Fci%2D4230.1.0%2Da%2Dd9de89fbd8?comp=config&type=sharedConfig&incarnation=1: Attempt #1 Mar 17 18:01:22.590775 coreos-metadata[1701]: Mar 17 18:01:22.590 INFO Fetch successful Mar 17 18:01:22.590775 coreos-metadata[1701]: Mar 17 18:01:22.590 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1 Mar 17 18:01:22.601443 coreos-metadata[1701]: Mar 17 18:01:22.601 INFO Fetch successful Mar 17 18:01:22.639566 bash[1774]: Updated "/home/core/.ssh/authorized_keys" Mar 17 18:01:22.641691 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Mar 17 18:01:22.661627 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 17 18:01:22.672539 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 17 18:01:22.675726 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 17 18:01:22.709250 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 42 scanned by (udev-worker) (1777) Mar 17 18:01:22.961980 locksmithd[1766]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 18:01:22.963771 sshd_keygen[1726]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 18:01:22.990700 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 17 18:01:23.004488 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 17 18:01:23.008517 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... Mar 17 18:01:23.015925 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 18:01:23.016224 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 17 18:01:23.024349 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 17 18:01:23.046282 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 17 18:01:23.059721 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 17 18:01:23.063956 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 17 18:01:23.067066 systemd[1]: Reached target getty.target - Login Prompts. Mar 17 18:01:23.071501 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. Mar 17 18:01:23.886594 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:01:23.891646 (kubelet)[1874]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:01:23.992148 containerd[1738]: time="2025-03-17T18:01:23.991390500Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Mar 17 18:01:24.024607 containerd[1738]: time="2025-03-17T18:01:24.024134300Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:01:24.026032 containerd[1738]: time="2025-03-17T18:01:24.025991600Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:01:24.026032 containerd[1738]: time="2025-03-17T18:01:24.026028300Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 18:01:24.026166 containerd[1738]: time="2025-03-17T18:01:24.026049300Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 18:01:24.026372 containerd[1738]: time="2025-03-17T18:01:24.026266600Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 17 18:01:24.026372 containerd[1738]: time="2025-03-17T18:01:24.026313000Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
type=io.containerd.snapshotter.v1 Mar 17 18:01:24.026483 containerd[1738]: time="2025-03-17T18:01:24.026406100Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:01:24.026483 containerd[1738]: time="2025-03-17T18:01:24.026423400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.026695400Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.026727200Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.026747400Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.026760800Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.026863700Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.027088300Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.027309400Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.027330600Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.027425900Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 18:01:24.027615 containerd[1738]: time="2025-03-17T18:01:24.027476000Z" level=info msg="metadata content store policy set" policy=shared Mar 17 18:01:24.082072 containerd[1738]: time="2025-03-17T18:01:24.082019200Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 18:01:24.082236 containerd[1738]: time="2025-03-17T18:01:24.082101100Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 18:01:24.082236 containerd[1738]: time="2025-03-17T18:01:24.082124500Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 17 18:01:24.082236 containerd[1738]: time="2025-03-17T18:01:24.082157000Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 17 18:01:24.082236 containerd[1738]: time="2025-03-17T18:01:24.082178500Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.082496000Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.082891400Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083013600Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083034000Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083054600Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083074400Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083092000Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083109500Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083128600Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083154000Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083163 containerd[1738]: time="2025-03-17T18:01:24.083170000Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083186400Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083201600Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083244900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083263200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083278800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083297800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083315400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083334100Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." 
type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083351300Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083368700Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083388000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083408500Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083427100Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083444700Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.083584 containerd[1738]: time="2025-03-17T18:01:24.083461600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083480300Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083508900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083528400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083543900Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083618000Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083644300Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083668900Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083687100Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083701700Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083718600Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083733500Z" level=info msg="NRI interface is disabled by configuration." Mar 17 18:01:24.084066 containerd[1738]: time="2025-03-17T18:01:24.083748800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 17 18:01:24.084548 containerd[1738]: time="2025-03-17T18:01:24.084139300Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 18:01:24.084548 containerd[1738]: time="2025-03-17T18:01:24.084269100Z" level=info msg="Connect containerd service" Mar 17 18:01:24.084548 containerd[1738]: time="2025-03-17T18:01:24.084319800Z" level=info msg="using legacy CRI server" Mar 17 18:01:24.084548 containerd[1738]: time="2025-03-17T18:01:24.084334400Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 17 18:01:24.084548 containerd[1738]: time="2025-03-17T18:01:24.084492200Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 18:01:24.086233 containerd[1738]: time="2025-03-17T18:01:24.086179700Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:01:24.087694 
containerd[1738]: time="2025-03-17T18:01:24.087648700Z" level=info msg="Start subscribing containerd event" Mar 17 18:01:24.087794 containerd[1738]: time="2025-03-17T18:01:24.087710000Z" level=info msg="Start recovering state" Mar 17 18:01:24.087794 containerd[1738]: time="2025-03-17T18:01:24.087780100Z" level=info msg="Start event monitor" Mar 17 18:01:24.087864 containerd[1738]: time="2025-03-17T18:01:24.087795400Z" level=info msg="Start snapshots syncer" Mar 17 18:01:24.087864 containerd[1738]: time="2025-03-17T18:01:24.087807600Z" level=info msg="Start cni network conf syncer for default" Mar 17 18:01:24.087864 containerd[1738]: time="2025-03-17T18:01:24.087817600Z" level=info msg="Start streaming server" Mar 17 18:01:24.088168 containerd[1738]: time="2025-03-17T18:01:24.088138100Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 18:01:24.088317 containerd[1738]: time="2025-03-17T18:01:24.088300800Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 18:01:24.088467 containerd[1738]: time="2025-03-17T18:01:24.088451200Z" level=info msg="containerd successfully booted in 0.098126s" Mar 17 18:01:24.088555 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 18:01:24.095756 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 17 18:01:24.101645 systemd[1]: Startup finished in 1.068s (firmware) + 35.341s (loader) + 920ms (kernel) + 11.692s (initrd) + 15.143s (userspace) = 1min 4.166s. Mar 17 18:01:24.616039 kubelet[1874]: E0317 18:01:24.615987 1874 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:01:24.618316 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:01:24.618504 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:01:24.618894 systemd[1]: kubelet.service: Consumed 916ms CPU time, 251.7M memory peak. Mar 17 18:01:24.711100 login[1863]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying Mar 17 18:01:24.711739 login[1862]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 17 18:01:24.721983 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 17 18:01:24.727474 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 17 18:01:24.731025 systemd-logind[1716]: New session 2 of user core. Mar 17 18:01:24.739954 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 17 18:01:24.745592 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 17 18:01:24.749882 (systemd)[1892]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:01:24.751961 systemd-logind[1716]: New session c1 of user core. Mar 17 18:01:24.934644 systemd[1892]: Queued start job for default target default.target. Mar 17 18:01:24.941274 systemd[1892]: Created slice app.slice - User Application Slice. Mar 17 18:01:24.941308 systemd[1892]: Reached target paths.target - Paths. Mar 17 18:01:24.941356 systemd[1892]: Reached target timers.target - Timers. Mar 17 18:01:24.942572 systemd[1892]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Mar 17 18:01:24.953196 systemd[1892]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 17 18:01:24.953291 systemd[1892]: Reached target sockets.target - Sockets. Mar 17 18:01:24.953341 systemd[1892]: Reached target basic.target - Basic System. Mar 17 18:01:24.953383 systemd[1892]: Reached target default.target - Main User Target. Mar 17 18:01:24.953415 systemd[1892]: Startup finished in 195ms. Mar 17 18:01:24.953893 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 17 18:01:24.961349 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 17 18:01:25.437761 waagent[1864]: 2025-03-17T18:01:25.437655Z INFO Daemon Daemon Azure Linux Agent Version: 2.9.1.1 Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.438094Z INFO Daemon Daemon OS: flatcar 4230.1.0 Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.438917Z INFO Daemon Daemon Python: 3.11.11 Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.440149Z INFO Daemon Daemon Run daemon Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.440933Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4230.1.0' Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.441598Z INFO Daemon Daemon Using waagent for provisioning Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.442145Z INFO Daemon Daemon Activate resource disk Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.442862Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.448589Z INFO Daemon Daemon Found device: None Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.449175Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.449926Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.451112Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 17 18:01:25.470376 waagent[1864]: 2025-03-17T18:01:25.451901Z INFO Daemon Daemon Running default provisioning handler Mar 17 18:01:25.473754 waagent[1864]: 2025-03-17T18:01:25.473509Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. Mar 17 18:01:25.486510 waagent[1864]: 2025-03-17T18:01:25.474347Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 17 18:01:25.486510 waagent[1864]: 2025-03-17T18:01:25.474602Z INFO Daemon Daemon cloud-init is enabled: False Mar 17 18:01:25.486510 waagent[1864]: 2025-03-17T18:01:25.475485Z INFO Daemon Daemon Copying ovf-env.xml Mar 17 18:01:25.541554 waagent[1864]: 2025-03-17T18:01:25.541458Z INFO Daemon Daemon Successfully mounted dvd Mar 17 18:01:25.579550 waagent[1864]: 2025-03-17T18:01:25.574146Z INFO Daemon Daemon Detect protocol endpoint Mar 17 18:01:25.579550 waagent[1864]: 2025-03-17T18:01:25.574514Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 17 18:01:25.579550 waagent[1864]: 2025-03-17T18:01:25.575380Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Mar 17 18:01:25.579550 waagent[1864]: 2025-03-17T18:01:25.576119Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 17 18:01:25.579550 waagent[1864]: 2025-03-17T18:01:25.577009Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 17 18:01:25.579550 waagent[1864]: 2025-03-17T18:01:25.577634Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 17 18:01:25.587726 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 17 18:01:25.631998 waagent[1864]: 2025-03-17T18:01:25.631938Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 17 18:01:25.638890 waagent[1864]: 2025-03-17T18:01:25.632426Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 17 18:01:25.638890 waagent[1864]: 2025-03-17T18:01:25.633018Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 17 18:01:25.713285 login[1863]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 17 18:01:25.719367 systemd-logind[1716]: New session 1 of user core. Mar 17 18:01:25.725357 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 17 18:01:25.831281 waagent[1864]: 2025-03-17T18:01:25.831158Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 17 18:01:25.834513 waagent[1864]: 2025-03-17T18:01:25.834449Z INFO Daemon Daemon Forcing an update of the goal state. Mar 17 18:01:25.840394 waagent[1864]: 2025-03-17T18:01:25.840339Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 17 18:01:25.856112 waagent[1864]: 2025-03-17T18:01:25.856053Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.166 Mar 17 18:01:25.869529 waagent[1864]: 2025-03-17T18:01:25.856732Z INFO Daemon Mar 17 18:01:25.869529 waagent[1864]: 2025-03-17T18:01:25.857485Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: 9cca0d5e-2681-487b-9fe8-4092a7271047 eTag: 8801070298294225747 source: Fabric] Mar 17 18:01:25.869529 waagent[1864]: 2025-03-17T18:01:25.858475Z INFO Daemon The vmSettings originated via Fabric; will ignore them. Mar 17 18:01:25.869529 waagent[1864]: 2025-03-17T18:01:25.859431Z INFO Daemon Mar 17 18:01:25.869529 waagent[1864]: 2025-03-17T18:01:25.860117Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] Mar 17 18:01:25.872386 waagent[1864]: 2025-03-17T18:01:25.872345Z INFO Daemon Daemon Downloading artifacts profile blob Mar 17 18:01:25.981435 waagent[1864]: 2025-03-17T18:01:25.981321Z INFO Daemon Downloaded certificate {'thumbprint': '8744BB52BA684F87321E6E6C3FBBEF8E6C239273', 'hasPrivateKey': True} Mar 17 18:01:25.986498 waagent[1864]: 2025-03-17T18:01:25.986442Z INFO Daemon Fetch goal state completed Mar 17 18:01:26.019739 waagent[1864]: 2025-03-17T18:01:26.019666Z INFO Daemon Daemon Starting provisioning Mar 17 18:01:26.022462 waagent[1864]: 2025-03-17T18:01:26.022334Z INFO Daemon Daemon Handle ovf-env.xml. 
Mar 17 18:01:26.024784 waagent[1864]: 2025-03-17T18:01:26.022553Z INFO Daemon Daemon Set hostname [ci-4230.1.0-a-d9de89fbd8] Mar 17 18:01:26.060557 waagent[1864]: 2025-03-17T18:01:26.060472Z INFO Daemon Daemon Publish hostname [ci-4230.1.0-a-d9de89fbd8] Mar 17 18:01:26.068956 waagent[1864]: 2025-03-17T18:01:26.060989Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 17 18:01:26.068956 waagent[1864]: 2025-03-17T18:01:26.062413Z INFO Daemon Daemon Primary interface is [eth0] Mar 17 18:01:26.071990 systemd-networkd[1344]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 18:01:26.072000 systemd-networkd[1344]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:01:26.072049 systemd-networkd[1344]: eth0: DHCP lease lost Mar 17 18:01:26.073115 waagent[1864]: 2025-03-17T18:01:26.073055Z INFO Daemon Daemon Create user account if not exists Mar 17 18:01:26.087625 waagent[1864]: 2025-03-17T18:01:26.075573Z INFO Daemon Daemon User core already exists, skip useradd Mar 17 18:01:26.087625 waagent[1864]: 2025-03-17T18:01:26.076120Z INFO Daemon Daemon Configure sudoer Mar 17 18:01:26.087625 waagent[1864]: 2025-03-17T18:01:26.077307Z INFO Daemon Daemon Configure sshd Mar 17 18:01:26.087625 waagent[1864]: 2025-03-17T18:01:26.078446Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. Mar 17 18:01:26.087625 waagent[1864]: 2025-03-17T18:01:26.078923Z INFO Daemon Daemon Deploy ssh public key. Mar 17 18:01:26.114268 systemd-networkd[1344]: eth0: DHCPv4 address 10.200.4.11/24, gateway 10.200.4.1 acquired from 168.63.129.16 Mar 17 18:01:27.232494 waagent[1864]: 2025-03-17T18:01:27.232420Z INFO Daemon Daemon Provisioning complete Mar 17 18:01:27.244754 waagent[1864]: 2025-03-17T18:01:27.244686Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 17 18:01:27.247555 waagent[1864]: 2025-03-17T18:01:27.247497Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. 
Mar 17 18:01:27.251519 waagent[1864]: 2025-03-17T18:01:27.251463Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.9.1.1 is the most current agent Mar 17 18:01:27.375223 waagent[1941]: 2025-03-17T18:01:27.375113Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.9.1.1) Mar 17 18:01:27.375651 waagent[1941]: 2025-03-17T18:01:27.375296Z INFO ExtHandler ExtHandler OS: flatcar 4230.1.0 Mar 17 18:01:27.375651 waagent[1941]: 2025-03-17T18:01:27.375379Z INFO ExtHandler ExtHandler Python: 3.11.11 Mar 17 18:01:27.960894 waagent[1941]: 2025-03-17T18:01:27.960783Z INFO ExtHandler ExtHandler Distro: flatcar-4230.1.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.11; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1; Mar 17 18:01:27.961165 waagent[1941]: 2025-03-17T18:01:27.961103Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:01:27.961304 waagent[1941]: 2025-03-17T18:01:27.961250Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:01:27.970041 waagent[1941]: 2025-03-17T18:01:27.969965Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 17 18:01:27.976231 waagent[1941]: 2025-03-17T18:01:27.976169Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.166 Mar 17 18:01:27.976710 waagent[1941]: 2025-03-17T18:01:27.976656Z INFO ExtHandler Mar 17 18:01:27.976792 waagent[1941]: 2025-03-17T18:01:27.976747Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 4022996e-f9a8-44d8-bc18-c77bfeefce05 eTag: 8801070298294225747 source: Fabric] Mar 17 18:01:27.977103 waagent[1941]: 2025-03-17T18:01:27.977055Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 17 18:01:28.126231 waagent[1941]: 2025-03-17T18:01:28.126096Z INFO ExtHandler Mar 17 18:01:28.126448 waagent[1941]: 2025-03-17T18:01:28.126373Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 17 18:01:28.131369 waagent[1941]: 2025-03-17T18:01:28.131314Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 17 18:01:28.191805 waagent[1941]: 2025-03-17T18:01:28.191728Z INFO ExtHandler Downloaded certificate {'thumbprint': '8744BB52BA684F87321E6E6C3FBBEF8E6C239273', 'hasPrivateKey': True} Mar 17 18:01:28.192319 waagent[1941]: 2025-03-17T18:01:28.192260Z INFO ExtHandler Fetch goal state completed Mar 17 18:01:28.204789 waagent[1941]: 2025-03-17T18:01:28.204723Z INFO ExtHandler ExtHandler WALinuxAgent-2.9.1.1 running as process 1941 Mar 17 18:01:28.204943 waagent[1941]: 2025-03-17T18:01:28.204892Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** Mar 17 18:01:28.206481 waagent[1941]: 2025-03-17T18:01:28.206422Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '4230.1.0', '', 'Flatcar Container Linux by Kinvolk'] Mar 17 18:01:28.206833 waagent[1941]: 2025-03-17T18:01:28.206786Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 17 18:01:28.250973 waagent[1941]: 2025-03-17T18:01:28.250869Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 17 18:01:28.251146 waagent[1941]: 2025-03-17T18:01:28.251092Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 17 18:01:28.258138 waagent[1941]: 2025-03-17T18:01:28.257829Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not 
enabled. Adding it now Mar 17 18:01:28.264587 systemd[1]: Reload requested from client PID 1954 ('systemctl') (unit waagent.service)... Mar 17 18:01:28.264607 systemd[1]: Reloading... Mar 17 18:01:28.355250 zram_generator::config[1996]: No configuration found. Mar 17 18:01:28.480661 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:01:28.593719 systemd[1]: Reloading finished in 328 ms. Mar 17 18:01:28.612734 waagent[1941]: 2025-03-17T18:01:28.612282Z INFO ExtHandler ExtHandler Executing systemctl daemon-reload for setting up waagent-network-setup.service Mar 17 18:01:28.622602 systemd[1]: Reload requested from client PID 2050 ('systemctl') (unit waagent.service)... Mar 17 18:01:28.622619 systemd[1]: Reloading... Mar 17 18:01:28.705236 zram_generator::config[2085]: No configuration found. Mar 17 18:01:28.835748 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:01:28.945688 systemd[1]: Reloading finished in 322 ms. Mar 17 18:01:28.963287 waagent[1941]: 2025-03-17T18:01:28.961120Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service Mar 17 18:01:28.963287 waagent[1941]: 2025-03-17T18:01:28.961358Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully Mar 17 18:01:29.350954 waagent[1941]: 2025-03-17T18:01:29.350852Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Mar 17 18:01:29.351652 waagent[1941]: 2025-03-17T18:01:29.351570Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [True], cgroups enabled [False], python supported: [True] Mar 17 18:01:29.352577 waagent[1941]: 2025-03-17T18:01:29.352512Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 17 18:01:29.353244 waagent[1941]: 2025-03-17T18:01:29.353167Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:01:29.353337 waagent[1941]: 2025-03-17T18:01:29.353243Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 17 18:01:29.353637 waagent[1941]: 2025-03-17T18:01:29.353563Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 17 18:01:29.353773 waagent[1941]: 2025-03-17T18:01:29.353709Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Mar 17 18:01:29.354832 waagent[1941]: 2025-03-17T18:01:29.354755Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:01:29.354936 waagent[1941]: 2025-03-17T18:01:29.354842Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:01:29.355054 waagent[1941]: 2025-03-17T18:01:29.355014Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:01:29.355200 waagent[1941]: 2025-03-17T18:01:29.355151Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 17 18:01:29.355557 waagent[1941]: 2025-03-17T18:01:29.355482Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 17 18:01:29.355635 waagent[1941]: 2025-03-17T18:01:29.355590Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. 
This indicates how often the agent checks for new goal states and reports status. Mar 17 18:01:29.356000 waagent[1941]: 2025-03-17T18:01:29.355937Z INFO EnvHandler ExtHandler Configure routes Mar 17 18:01:29.356119 waagent[1941]: 2025-03-17T18:01:29.356071Z INFO EnvHandler ExtHandler Gateway:None Mar 17 18:01:29.356280 waagent[1941]: 2025-03-17T18:01:29.356236Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 17 18:01:29.356622 waagent[1941]: 2025-03-17T18:01:29.356578Z INFO EnvHandler ExtHandler Routes:None Mar 17 18:01:29.356735 waagent[1941]: 2025-03-17T18:01:29.356662Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 17 18:01:29.356735 waagent[1941]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 17 18:01:29.356735 waagent[1941]: eth0 00000000 0104C80A 0003 0 0 1024 00000000 0 0 0 Mar 17 18:01:29.356735 waagent[1941]: eth0 0004C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 17 18:01:29.356735 waagent[1941]: eth0 0104C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:01:29.356735 waagent[1941]: eth0 10813FA8 0104C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:01:29.356735 waagent[1941]: eth0 FEA9FEA9 0104C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:01:29.369449 waagent[1941]: 2025-03-17T18:01:29.368236Z INFO ExtHandler ExtHandler Mar 17 18:01:29.369449 waagent[1941]: 2025-03-17T18:01:29.368348Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: e9fb139f-c9cb-4f06-9473-bebab487c1cb correlation bfee555b-a980-491e-aa0c-818f1c1e6bc8 created: 2025-03-17T18:00:06.830170Z] Mar 17 18:01:29.369449 waagent[1941]: 2025-03-17T18:01:29.368847Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 17 18:01:29.370798 waagent[1941]: 2025-03-17T18:01:29.369899Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 1 ms] Mar 17 18:01:29.401828 waagent[1941]: 2025-03-17T18:01:29.401763Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.9.1.1 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 8CE52F84-E25B-4E1A-BE85-F3E8D6043664;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0] Mar 17 18:01:29.424916 waagent[1941]: 2025-03-17T18:01:29.424851Z INFO MonitorHandler ExtHandler Network interfaces: Mar 17 18:01:29.424916 waagent[1941]: Executing ['ip', '-a', '-o', 'link']: Mar 17 18:01:29.424916 waagent[1941]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 17 18:01:29.424916 waagent[1941]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:79:17:7a brd ff:ff:ff:ff:ff:ff Mar 17 18:01:29.424916 waagent[1941]: 3: enP51396s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:79:17:7a brd ff:ff:ff:ff:ff:ff\ altname enP51396p0s2 Mar 17 18:01:29.424916 waagent[1941]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 17 18:01:29.424916 waagent[1941]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 17 18:01:29.424916 waagent[1941]: 2: eth0 inet 10.200.4.11/24 metric 1024 brd 10.200.4.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 17 18:01:29.424916 waagent[1941]: Executing ['ip', '-6', '-a', '-o', 'address']: Mar 17 18:01:29.424916 waagent[1941]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever Mar 17 18:01:29.424916 waagent[1941]: 2: eth0 inet6 fe80::7e1e:52ff:fe79:177a/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 17 18:01:29.424916 waagent[1941]: 3: enP51396s1 inet6 fe80::7e1e:52ff:fe79:177a/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever Mar 17 18:01:29.459251 waagent[1941]: 2025-03-17T18:01:29.459136Z INFO EnvHandler ExtHandler Successfully added Azure fabric firewall rules. 
Current Firewall rules: Mar 17 18:01:29.459251 waagent[1941]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:01:29.459251 waagent[1941]: pkts bytes target prot opt in out source destination Mar 17 18:01:29.459251 waagent[1941]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:01:29.459251 waagent[1941]: pkts bytes target prot opt in out source destination Mar 17 18:01:29.459251 waagent[1941]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:01:29.459251 waagent[1941]: pkts bytes target prot opt in out source destination Mar 17 18:01:29.459251 waagent[1941]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 17 18:01:29.459251 waagent[1941]: 10 1102 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 17 18:01:29.459251 waagent[1941]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 17 18:01:29.462849 waagent[1941]: 2025-03-17T18:01:29.462789Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 17 18:01:29.462849 waagent[1941]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:01:29.462849 waagent[1941]: pkts bytes target prot opt in out source destination Mar 17 18:01:29.462849 waagent[1941]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:01:29.462849 waagent[1941]: pkts bytes target prot opt in out source destination Mar 17 18:01:29.462849 waagent[1941]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:01:29.462849 waagent[1941]: pkts bytes target prot opt in out source destination Mar 17 18:01:29.462849 waagent[1941]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 17 18:01:29.462849 waagent[1941]: 12 1214 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 17 18:01:29.462849 waagent[1941]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 17 18:01:29.463246 waagent[1941]: 2025-03-17T18:01:29.463085Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 17 18:01:34.778333 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 18:01:34.783464 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:01:34.899818 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:01:34.904025 (kubelet)[2185]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:01:35.537660 kubelet[2185]: E0317 18:01:35.537557 2185 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:01:35.541291 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:01:35.541500 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:01:35.541888 systemd[1]: kubelet.service: Consumed 145ms CPU time, 104.7M memory peak. Mar 17 18:01:45.778407 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 17 18:01:45.785471 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:01:45.884633 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 17 18:01:45.888747 (kubelet)[2199]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:01:46.165002 chronyd[1722]: Selected source PHC0 Mar 17 18:01:46.538171 kubelet[2199]: E0317 18:01:46.538112 2199 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:01:46.540590 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:01:46.540865 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:01:46.541271 systemd[1]: kubelet.service: Consumed 139ms CPU time, 102M memory peak. Mar 17 18:01:56.778478 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 17 18:01:56.795441 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:01:57.001469 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:01:57.005316 (kubelet)[2215]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:01:57.484176 kubelet[2215]: E0317 18:01:57.484100 2215 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:01:57.486292 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:01:57.486499 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:01:57.486953 systemd[1]: kubelet.service: Consumed 136ms CPU time, 104.3M memory peak. Mar 17 18:02:04.330227 kernel: hv_balloon: Max. dynamic memory size: 8192 MB Mar 17 18:02:07.346465 update_engine[1718]: I20250317 18:02:07.346356 1718 update_attempter.cc:509] Updating boot flags... Mar 17 18:02:07.397892 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 42 scanned by (udev-worker) (2238) Mar 17 18:02:07.501848 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 17 18:02:07.518458 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:02:07.585228 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 42 scanned by (udev-worker) (2241) Mar 17 18:02:08.224732 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:08.229067 (kubelet)[2345]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:02:08.264595 kubelet[2345]: E0317 18:02:08.264512 2345 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:02:08.266690 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:02:08.266897 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 17 18:02:08.267300 systemd[1]: kubelet.service: Consumed 148ms CPU time, 103.1M memory peak. Mar 17 18:02:18.278477 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 17 18:02:18.290425 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:02:18.652439 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:18.656507 (kubelet)[2360]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:02:19.051057 kubelet[2360]: E0317 18:02:19.051006 2360 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:02:19.053291 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:02:19.053493 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:02:19.053928 systemd[1]: kubelet.service: Consumed 147ms CPU time, 107.9M memory peak. Mar 17 18:02:24.220136 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 17 18:02:24.225509 systemd[1]: Started sshd@0-10.200.4.11:22-10.200.16.10:40956.service - OpenSSH per-connection server daemon (10.200.16.10:40956). Mar 17 18:02:25.038581 sshd[2367]: Accepted publickey for core from 10.200.16.10 port 40956 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:25.040288 sshd-session[2367]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:25.045369 systemd-logind[1716]: New session 3 of user core. Mar 17 18:02:25.054541 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 17 18:02:25.596508 systemd[1]: Started sshd@1-10.200.4.11:22-10.200.16.10:40964.service - OpenSSH per-connection server daemon (10.200.16.10:40964). Mar 17 18:02:26.204776 sshd[2372]: Accepted publickey for core from 10.200.16.10 port 40964 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:26.206440 sshd-session[2372]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:26.210790 systemd-logind[1716]: New session 4 of user core. Mar 17 18:02:26.217579 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 17 18:02:26.636771 sshd[2374]: Connection closed by 10.200.16.10 port 40964 Mar 17 18:02:26.637880 sshd-session[2372]: pam_unix(sshd:session): session closed for user core Mar 17 18:02:26.641290 systemd[1]: sshd@1-10.200.4.11:22-10.200.16.10:40964.service: Deactivated successfully. Mar 17 18:02:26.643566 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 18:02:26.644973 systemd-logind[1716]: Session 4 logged out. Waiting for processes to exit. Mar 17 18:02:26.645925 systemd-logind[1716]: Removed session 4. Mar 17 18:02:26.753494 systemd[1]: Started sshd@2-10.200.4.11:22-10.200.16.10:40974.service - OpenSSH per-connection server daemon (10.200.16.10:40974). Mar 17 18:02:27.366167 sshd[2380]: Accepted publickey for core from 10.200.16.10 port 40974 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:27.367738 sshd-session[2380]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:27.372031 systemd-logind[1716]: New session 5 of user core. 
Mar 17 18:02:27.380351 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 17 18:02:27.815126 sshd[2382]: Connection closed by 10.200.16.10 port 40974 Mar 17 18:02:27.816285 sshd-session[2380]: pam_unix(sshd:session): session closed for user core Mar 17 18:02:27.820572 systemd[1]: sshd@2-10.200.4.11:22-10.200.16.10:40974.service: Deactivated successfully. Mar 17 18:02:27.822732 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 18:02:27.823671 systemd-logind[1716]: Session 5 logged out. Waiting for processes to exit. Mar 17 18:02:27.824733 systemd-logind[1716]: Removed session 5. Mar 17 18:02:27.927501 systemd[1]: Started sshd@3-10.200.4.11:22-10.200.16.10:40986.service - OpenSSH per-connection server daemon (10.200.16.10:40986). Mar 17 18:02:28.540583 sshd[2388]: Accepted publickey for core from 10.200.16.10 port 40986 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:28.542180 sshd-session[2388]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:28.548182 systemd-logind[1716]: New session 6 of user core. Mar 17 18:02:28.558368 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 17 18:02:28.996073 sshd[2390]: Connection closed by 10.200.16.10 port 40986 Mar 17 18:02:28.999352 sshd-session[2388]: pam_unix(sshd:session): session closed for user core Mar 17 18:02:29.001869 systemd[1]: sshd@3-10.200.4.11:22-10.200.16.10:40986.service: Deactivated successfully. Mar 17 18:02:29.003807 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 18:02:29.005164 systemd-logind[1716]: Session 6 logged out. Waiting for processes to exit. Mar 17 18:02:29.006339 systemd-logind[1716]: Removed session 6. Mar 17 18:02:29.104172 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Mar 17 18:02:29.115398 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:02:29.117565 systemd[1]: Started sshd@4-10.200.4.11:22-10.200.16.10:49850.service - OpenSSH per-connection server daemon (10.200.16.10:49850). Mar 17 18:02:29.469001 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:29.475581 (kubelet)[2406]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 18:02:29.513180 kubelet[2406]: E0317 18:02:29.513127 2406 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:02:29.515443 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:02:29.515639 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:02:29.516030 systemd[1]: kubelet.service: Consumed 143ms CPU time, 105.8M memory peak. Mar 17 18:02:29.734022 sshd[2397]: Accepted publickey for core from 10.200.16.10 port 49850 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:29.735433 sshd-session[2397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:29.739863 systemd-logind[1716]: New session 7 of user core. Mar 17 18:02:29.747342 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 17 18:02:30.300829 sudo[2414]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 18:02:30.301173 sudo[2414]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 18:02:30.316596 sudo[2414]: pam_unix(sudo:session): session closed for user root Mar 17 18:02:30.413618 sshd[2413]: Connection closed by 10.200.16.10 port 49850 Mar 17 18:02:30.414656 sshd-session[2397]: pam_unix(sshd:session): session closed for user core Mar 17 18:02:30.419342 systemd[1]: sshd@4-10.200.4.11:22-10.200.16.10:49850.service: Deactivated successfully. Mar 17 18:02:30.421288 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 18:02:30.422038 systemd-logind[1716]: Session 7 logged out. Waiting for processes to exit. Mar 17 18:02:30.423076 systemd-logind[1716]: Removed session 7. Mar 17 18:02:30.532489 systemd[1]: Started sshd@5-10.200.4.11:22-10.200.16.10:49866.service - OpenSSH per-connection server daemon (10.200.16.10:49866). Mar 17 18:02:31.139955 sshd[2420]: Accepted publickey for core from 10.200.16.10 port 49866 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:31.141648 sshd-session[2420]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:31.146939 systemd-logind[1716]: New session 8 of user core. Mar 17 18:02:31.152363 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 17 18:02:31.475152 sudo[2424]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 18:02:31.475535 sudo[2424]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 18:02:31.478808 sudo[2424]: pam_unix(sudo:session): session closed for user root Mar 17 18:02:31.483555 sudo[2423]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 17 18:02:31.483883 sudo[2423]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 18:02:31.496611 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 17 18:02:31.522130 augenrules[2446]: No rules Mar 17 18:02:31.523476 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 18:02:31.523716 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 17 18:02:31.524826 sudo[2423]: pam_unix(sudo:session): session closed for user root Mar 17 18:02:31.622144 sshd[2422]: Connection closed by 10.200.16.10 port 49866 Mar 17 18:02:31.622909 sshd-session[2420]: pam_unix(sshd:session): session closed for user core Mar 17 18:02:31.627546 systemd[1]: sshd@5-10.200.4.11:22-10.200.16.10:49866.service: Deactivated successfully. Mar 17 18:02:31.629736 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 18:02:31.630734 systemd-logind[1716]: Session 8 logged out. Waiting for processes to exit. Mar 17 18:02:31.631772 systemd-logind[1716]: Removed session 8. Mar 17 18:02:31.735503 systemd[1]: Started sshd@6-10.200.4.11:22-10.200.16.10:49882.service - OpenSSH per-connection server daemon (10.200.16.10:49882). Mar 17 18:02:32.343113 sshd[2455]: Accepted publickey for core from 10.200.16.10 port 49882 ssh2: RSA SHA256:sqIb/6ECvJ+NcGlXWH+RGh9zoFex64JsYEL5FCi862w Mar 17 18:02:32.344750 sshd-session[2455]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 18:02:32.349200 systemd-logind[1716]: New session 9 of user core. Mar 17 18:02:32.358340 systemd[1]: Started session-9.scope - Session 9 of User core. 
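
Note: sessions 7 and 8 above record the exact privileged commands run over SSH. Reproduced as a plain shell sequence (the commands and paths are copied verbatim from the sudo log lines; nothing else is assumed):

    # Commands recorded in sessions 7-8 above.
    sudo /usr/sbin/setenforce 1
    sudo /usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
    sudo /usr/sbin/systemctl restart audit-rules   # augenrules then reports "No rules"
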
Mar 17 18:02:32.678552 sudo[2458]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 18:02:32.678918 sudo[2458]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 18:02:33.212813 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:33.213100 systemd[1]: kubelet.service: Consumed 143ms CPU time, 105.8M memory peak. Mar 17 18:02:33.218698 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:02:33.253265 systemd[1]: Reload requested from client PID 2491 ('systemctl') (unit session-9.scope)... Mar 17 18:02:33.253289 systemd[1]: Reloading... Mar 17 18:02:33.354238 zram_generator::config[2535]: No configuration found. Mar 17 18:02:33.508364 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:02:33.622292 systemd[1]: Reloading finished in 368 ms. Mar 17 18:02:33.666000 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:33.672341 (kubelet)[2597]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 18:02:33.678536 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:02:33.679148 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:02:33.679420 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:33.679471 systemd[1]: kubelet.service: Consumed 119ms CPU time, 93.3M memory peak. Mar 17 18:02:33.684525 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 18:02:33.936108 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 18:02:33.947542 (kubelet)[2614]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 18:02:33.981493 kubelet[2614]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:02:33.981493 kubelet[2614]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 17 18:02:33.981493 kubelet[2614]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
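
Note: after install.sh runs in session 9, systemd reloads its unit files and the kubelet comes back with --container-runtime-endpoint, --pod-infra-container-image and --volume-plugin-dir on the command line (hence the deprecation warnings), and the unset-variable warning now names only KUBELET_EXTRA_ARGS, so KUBELET_KUBEADM_ARGS is being supplied. The drop-in that wires those variables in is not shown in the log; a hedged sketch of the usual kubeadm-style arrangement:

    # Assumed shape of the drop-in written by install.sh; the real file does not appear in this log.
    # /etc/systemd/system/kubelet.service.d/10-kubeadm.conf
    #   [Service]
    #   EnvironmentFile=-/var/lib/kubelet/kubeadm-flags.env   # defines KUBELET_KUBEADM_ARGS
    #   EnvironmentFile=-/etc/default/kubelet                 # would define KUBELET_EXTRA_ARGS
    sudo systemctl daemon-reload && sudo systemctl restart kubelet
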
Mar 17 18:02:33.981920 kubelet[2614]: I0317 18:02:33.981554 2614 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:02:34.617234 kubelet[2614]: I0317 18:02:34.616442 2614 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 17 18:02:34.617234 kubelet[2614]: I0317 18:02:34.616481 2614 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:02:34.617234 kubelet[2614]: I0317 18:02:34.616965 2614 server.go:954] "Client rotation is on, will bootstrap in background" Mar 17 18:02:34.645485 kubelet[2614]: I0317 18:02:34.645451 2614 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:02:34.656613 kubelet[2614]: E0317 18:02:34.656492 2614 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 17 18:02:34.656613 kubelet[2614]: I0317 18:02:34.656521 2614 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Mar 17 18:02:34.660092 kubelet[2614]: I0317 18:02:34.660062 2614 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 18:02:34.662747 kubelet[2614]: I0317 18:02:34.662241 2614 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:02:34.662747 kubelet[2614]: I0317 18:02:34.662294 2614 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"10.200.4.11","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 17 18:02:34.662747 kubelet[2614]: I0317 18:02:34.662506 2614 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:02:34.662747 kubelet[2614]: I0317 18:02:34.662520 2614 container_manager_linux.go:304] "Creating device plugin manager" Mar 
17 18:02:34.662981 kubelet[2614]: I0317 18:02:34.662643 2614 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:02:34.665712 kubelet[2614]: I0317 18:02:34.665691 2614 kubelet.go:446] "Attempting to sync node with API server" Mar 17 18:02:34.665712 kubelet[2614]: I0317 18:02:34.665712 2614 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 18:02:34.665831 kubelet[2614]: I0317 18:02:34.665733 2614 kubelet.go:352] "Adding apiserver pod source" Mar 17 18:02:34.665831 kubelet[2614]: I0317 18:02:34.665746 2614 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:02:34.666559 kubelet[2614]: E0317 18:02:34.666248 2614 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:34.666559 kubelet[2614]: E0317 18:02:34.666494 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:34.668244 kubelet[2614]: I0317 18:02:34.668225 2614 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 18:02:34.668639 kubelet[2614]: I0317 18:02:34.668623 2614 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:02:34.669150 kubelet[2614]: W0317 18:02:34.669129 2614 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 17 18:02:34.671001 kubelet[2614]: I0317 18:02:34.670971 2614 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 17 18:02:34.671075 kubelet[2614]: I0317 18:02:34.671013 2614 server.go:1287] "Started kubelet" Mar 17 18:02:34.671192 kubelet[2614]: I0317 18:02:34.671152 2614 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:02:34.672530 kubelet[2614]: I0317 18:02:34.672096 2614 server.go:490] "Adding debug handlers to kubelet server" Mar 17 18:02:34.674976 kubelet[2614]: I0317 18:02:34.674920 2614 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:02:34.675477 kubelet[2614]: I0317 18:02:34.675457 2614 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:02:34.676851 kubelet[2614]: I0317 18:02:34.676685 2614 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:02:34.682139 kubelet[2614]: E0317 18:02:34.680711 2614 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{10.200.4.11.182da918c2933563 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:10.200.4.11,UID:10.200.4.11,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:10.200.4.11,},FirstTimestamp:2025-03-17 18:02:34.670986595 +0000 UTC m=+0.720219470,LastTimestamp:2025-03-17 18:02:34.670986595 +0000 UTC m=+0.720219470,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:10.200.4.11,}" Mar 17 18:02:34.687090 kubelet[2614]: I0317 18:02:34.686054 2614 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 17 18:02:34.688891 kubelet[2614]: I0317 18:02:34.688873 2614 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 17 18:02:34.689146 kubelet[2614]: E0317 18:02:34.689126 2614 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"10.200.4.11\" not found" Mar 17 18:02:34.690029 kubelet[2614]: I0317 18:02:34.690008 2614 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:02:34.690299 kubelet[2614]: I0317 18:02:34.690275 2614 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:02:34.693499 kubelet[2614]: I0317 18:02:34.693475 2614 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:02:34.693576 kubelet[2614]: I0317 18:02:34.693540 2614 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:02:34.696013 kubelet[2614]: E0317 18:02:34.695991 2614 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:02:34.696256 kubelet[2614]: I0317 18:02:34.696235 2614 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:02:34.715258 kubelet[2614]: E0317 18:02:34.715160 2614 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"10.200.4.11\" not found" node="10.200.4.11" Mar 17 18:02:34.719340 kubelet[2614]: I0317 18:02:34.719270 2614 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 17 18:02:34.719340 kubelet[2614]: I0317 18:02:34.719283 2614 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 17 18:02:34.719340 kubelet[2614]: I0317 18:02:34.719305 2614 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:02:34.723864 kubelet[2614]: I0317 18:02:34.723758 2614 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 18:02:34.725398 kubelet[2614]: I0317 18:02:34.725384 2614 policy_none.go:49] "None policy: Start" Mar 17 18:02:34.725496 kubelet[2614]: I0317 18:02:34.725485 2614 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 17 18:02:34.725588 kubelet[2614]: I0317 18:02:34.725578 2614 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:02:34.726555 kubelet[2614]: I0317 18:02:34.726533 2614 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 18:02:34.726555 kubelet[2614]: I0317 18:02:34.726557 2614 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 17 18:02:34.726673 kubelet[2614]: I0317 18:02:34.726576 2614 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 17 18:02:34.726673 kubelet[2614]: I0317 18:02:34.726585 2614 kubelet.go:2388] "Starting kubelet main sync loop" Mar 17 18:02:34.726752 kubelet[2614]: E0317 18:02:34.726691 2614 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:02:34.737017 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 17 18:02:34.746945 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 17 18:02:34.756955 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 17 18:02:34.758646 kubelet[2614]: I0317 18:02:34.758404 2614 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:02:34.758716 kubelet[2614]: I0317 18:02:34.758657 2614 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 17 18:02:34.758716 kubelet[2614]: I0317 18:02:34.758670 2614 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:02:34.759310 kubelet[2614]: I0317 18:02:34.759021 2614 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:02:34.760538 kubelet[2614]: E0317 18:02:34.760517 2614 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 17 18:02:34.760636 kubelet[2614]: E0317 18:02:34.760559 2614 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"10.200.4.11\" not found" Mar 17 18:02:34.860250 kubelet[2614]: I0317 18:02:34.860216 2614 kubelet_node_status.go:76] "Attempting to register node" node="10.200.4.11" Mar 17 18:02:34.868799 kubelet[2614]: I0317 18:02:34.868671 2614 kubelet_node_status.go:79] "Successfully registered node" node="10.200.4.11" Mar 17 18:02:34.967998 sudo[2458]: pam_unix(sudo:session): session closed for user root Mar 17 18:02:34.973006 kubelet[2614]: I0317 18:02:34.972973 2614 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24" Mar 17 18:02:34.973443 containerd[1738]: time="2025-03-17T18:02:34.973395430Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 18:02:34.974004 kubelet[2614]: I0317 18:02:34.973634 2614 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24" Mar 17 18:02:35.064883 sshd[2457]: Connection closed by 10.200.16.10 port 49882 Mar 17 18:02:35.065704 sshd-session[2455]: pam_unix(sshd:session): session closed for user core Mar 17 18:02:35.068986 systemd[1]: sshd@6-10.200.4.11:22-10.200.16.10:49882.service: Deactivated successfully. Mar 17 18:02:35.071155 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 18:02:35.071392 systemd[1]: session-9.scope: Consumed 431ms CPU time, 75.5M memory peak. Mar 17 18:02:35.073636 systemd-logind[1716]: Session 9 logged out. Waiting for processes to exit. Mar 17 18:02:35.075363 systemd-logind[1716]: Removed session 9. 
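
Note: the node registers itself as 10.200.4.11 and the runtime is handed pod CIDR 192.168.1.0/24, while containerd notes that no CNI config has been dropped yet. From a machine with cluster credentials (assumed available; none are shown in this log) the same facts can be read back:

    # Assumes a kubeconfig with access to this cluster is at hand.
    kubectl get node 10.200.4.11 -o jsonpath='{.spec.podCIDR}{"\n"}'     # expect 192.168.1.0/24
    kubectl get pods -A --field-selector spec.nodeName=10.200.4.11
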
Mar 17 18:02:35.623420 kubelet[2614]: I0317 18:02:35.623354 2614 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 17 18:02:35.623962 kubelet[2614]: W0317 18:02:35.623627 2614 reflector.go:492] k8s.io/client-go/informers/factory.go:160: watch of *v1.CSIDriver ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 17 18:02:35.623962 kubelet[2614]: W0317 18:02:35.623675 2614 reflector.go:492] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 17 18:02:35.624139 kubelet[2614]: W0317 18:02:35.624107 2614 reflector.go:492] k8s.io/client-go/informers/factory.go:160: watch of *v1.RuntimeClass ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Mar 17 18:02:35.666807 kubelet[2614]: I0317 18:02:35.666777 2614 apiserver.go:52] "Watching apiserver" Mar 17 18:02:35.666962 kubelet[2614]: E0317 18:02:35.666773 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:35.670654 kubelet[2614]: E0317 18:02:35.670589 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:35.678807 systemd[1]: Created slice kubepods-besteffort-pod714c58ad_c4b0_401c_9d1f_4e806d24476d.slice - libcontainer container kubepods-besteffort-pod714c58ad_c4b0_401c_9d1f_4e806d24476d.slice. Mar 17 18:02:35.694100 kubelet[2614]: I0317 18:02:35.693909 2614 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:02:35.694007 systemd[1]: Created slice kubepods-besteffort-pod08c7d3c4_1287_4d0c_b98b_c036d1e808d2.slice - libcontainer container kubepods-besteffort-pod08c7d3c4_1287_4d0c_b98b_c036d1e808d2.slice. 
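
Note: "Certificate rotation detected" means the bootstrap client credential has been replaced by an issued certificate, so the existing watches are torn down and re-established (the short-watch warnings above). A hedged way to confirm this on the node — the path below is the kubelet's default rotation symlink and is not printed in this log:

    # Default location of the rotated kubelet client certificate (assumed, not shown here).
    sudo openssl x509 -in /var/lib/kubelet/pki/kubelet-client-current.pem -noout -subject -enddate
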
Mar 17 18:02:35.700862 kubelet[2614]: I0317 18:02:35.700827 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxdvl\" (UniqueName: \"kubernetes.io/projected/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-kube-api-access-lxdvl\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.700862 kubelet[2614]: I0317 18:02:35.700860 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82ad5d43-e4e0-4c4f-8306-eb61af926aaf-kubelet-dir\") pod \"csi-node-driver-dqftv\" (UID: \"82ad5d43-e4e0-4c4f-8306-eb61af926aaf\") " pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:35.701115 kubelet[2614]: I0317 18:02:35.700885 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlqjt\" (UniqueName: \"kubernetes.io/projected/82ad5d43-e4e0-4c4f-8306-eb61af926aaf-kube-api-access-dlqjt\") pod \"csi-node-driver-dqftv\" (UID: \"82ad5d43-e4e0-4c4f-8306-eb61af926aaf\") " pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:35.701115 kubelet[2614]: I0317 18:02:35.700906 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gll\" (UniqueName: \"kubernetes.io/projected/714c58ad-c4b0-401c-9d1f-4e806d24476d-kube-api-access-j4gll\") pod \"kube-proxy-4k5bl\" (UID: \"714c58ad-c4b0-401c-9d1f-4e806d24476d\") " pod="kube-system/kube-proxy-4k5bl" Mar 17 18:02:35.701115 kubelet[2614]: I0317 18:02:35.700929 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-policysync\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701115 kubelet[2614]: I0317 18:02:35.700949 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-flexvol-driver-host\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701115 kubelet[2614]: I0317 18:02:35.700971 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82ad5d43-e4e0-4c4f-8306-eb61af926aaf-socket-dir\") pod \"csi-node-driver-dqftv\" (UID: \"82ad5d43-e4e0-4c4f-8306-eb61af926aaf\") " pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:35.701361 kubelet[2614]: I0317 18:02:35.700991 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82ad5d43-e4e0-4c4f-8306-eb61af926aaf-registration-dir\") pod \"csi-node-driver-dqftv\" (UID: \"82ad5d43-e4e0-4c4f-8306-eb61af926aaf\") " pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:35.701361 kubelet[2614]: I0317 18:02:35.701014 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/714c58ad-c4b0-401c-9d1f-4e806d24476d-xtables-lock\") pod \"kube-proxy-4k5bl\" (UID: \"714c58ad-c4b0-401c-9d1f-4e806d24476d\") " pod="kube-system/kube-proxy-4k5bl" Mar 17 
18:02:35.701361 kubelet[2614]: I0317 18:02:35.701034 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-cni-bin-dir\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701361 kubelet[2614]: I0317 18:02:35.701055 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-cni-log-dir\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701361 kubelet[2614]: I0317 18:02:35.701075 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/82ad5d43-e4e0-4c4f-8306-eb61af926aaf-varrun\") pod \"csi-node-driver-dqftv\" (UID: \"82ad5d43-e4e0-4c4f-8306-eb61af926aaf\") " pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:35.701550 kubelet[2614]: I0317 18:02:35.701097 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/714c58ad-c4b0-401c-9d1f-4e806d24476d-lib-modules\") pod \"kube-proxy-4k5bl\" (UID: \"714c58ad-c4b0-401c-9d1f-4e806d24476d\") " pod="kube-system/kube-proxy-4k5bl" Mar 17 18:02:35.701550 kubelet[2614]: I0317 18:02:35.701119 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-var-run-calico\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701550 kubelet[2614]: I0317 18:02:35.701144 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-var-lib-calico\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701550 kubelet[2614]: I0317 18:02:35.701165 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-cni-net-dir\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701550 kubelet[2614]: I0317 18:02:35.701186 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/714c58ad-c4b0-401c-9d1f-4e806d24476d-kube-proxy\") pod \"kube-proxy-4k5bl\" (UID: \"714c58ad-c4b0-401c-9d1f-4e806d24476d\") " pod="kube-system/kube-proxy-4k5bl" Mar 17 18:02:35.701729 kubelet[2614]: I0317 18:02:35.701226 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-lib-modules\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701729 kubelet[2614]: I0317 18:02:35.701248 2614 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-xtables-lock\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701729 kubelet[2614]: I0317 18:02:35.701269 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-tigera-ca-bundle\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.701729 kubelet[2614]: I0317 18:02:35.701291 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/08c7d3c4-1287-4d0c-b98b-c036d1e808d2-node-certs\") pod \"calico-node-9hhzz\" (UID: \"08c7d3c4-1287-4d0c-b98b-c036d1e808d2\") " pod="calico-system/calico-node-9hhzz" Mar 17 18:02:35.805007 kubelet[2614]: E0317 18:02:35.804350 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.805007 kubelet[2614]: W0317 18:02:35.804373 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.805007 kubelet[2614]: E0317 18:02:35.804396 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.805007 kubelet[2614]: E0317 18:02:35.804615 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.805007 kubelet[2614]: W0317 18:02:35.804626 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.805007 kubelet[2614]: E0317 18:02:35.804644 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.805007 kubelet[2614]: E0317 18:02:35.804823 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.805007 kubelet[2614]: W0317 18:02:35.804834 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.805007 kubelet[2614]: E0317 18:02:35.804858 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:35.806624 kubelet[2614]: E0317 18:02:35.806388 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.806624 kubelet[2614]: W0317 18:02:35.806404 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.806624 kubelet[2614]: E0317 18:02:35.806433 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.808377 kubelet[2614]: E0317 18:02:35.808320 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.808377 kubelet[2614]: W0317 18:02:35.808336 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.808377 kubelet[2614]: E0317 18:02:35.808355 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.809414 kubelet[2614]: E0317 18:02:35.809393 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.809414 kubelet[2614]: W0317 18:02:35.809411 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.809540 kubelet[2614]: E0317 18:02:35.809434 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.811128 kubelet[2614]: E0317 18:02:35.811103 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.811128 kubelet[2614]: W0317 18:02:35.811123 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.811276 kubelet[2614]: E0317 18:02:35.811139 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.813229 kubelet[2614]: E0317 18:02:35.811331 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.813229 kubelet[2614]: W0317 18:02:35.811344 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.813229 kubelet[2614]: E0317 18:02:35.811357 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:35.813229 kubelet[2614]: E0317 18:02:35.811572 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.813229 kubelet[2614]: W0317 18:02:35.811583 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.813229 kubelet[2614]: E0317 18:02:35.811596 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.815492 kubelet[2614]: E0317 18:02:35.815477 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.815651 kubelet[2614]: W0317 18:02:35.815634 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.815746 kubelet[2614]: E0317 18:02:35.815734 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.821636 kubelet[2614]: E0317 18:02:35.821620 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.821759 kubelet[2614]: W0317 18:02:35.821745 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.821931 kubelet[2614]: E0317 18:02:35.821903 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.822242 kubelet[2614]: E0317 18:02:35.822200 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.822341 kubelet[2614]: W0317 18:02:35.822329 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.822527 kubelet[2614]: E0317 18:02:35.822511 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.822921 kubelet[2614]: E0317 18:02:35.822906 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.823025 kubelet[2614]: W0317 18:02:35.823013 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.823187 kubelet[2614]: E0317 18:02:35.823174 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:35.823771 kubelet[2614]: E0317 18:02:35.823736 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.824083 kubelet[2614]: W0317 18:02:35.823902 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.824083 kubelet[2614]: E0317 18:02:35.823987 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.824559 kubelet[2614]: E0317 18:02:35.824471 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.824559 kubelet[2614]: W0317 18:02:35.824501 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.826305 kubelet[2614]: E0317 18:02:35.826254 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.830184 kubelet[2614]: E0317 18:02:35.830062 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.830184 kubelet[2614]: W0317 18:02:35.830077 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.830184 kubelet[2614]: E0317 18:02:35.830102 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.831978 kubelet[2614]: E0317 18:02:35.830673 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.831978 kubelet[2614]: W0317 18:02:35.830687 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.831978 kubelet[2614]: E0317 18:02:35.830846 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:35.835301 kubelet[2614]: E0317 18:02:35.835280 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.835301 kubelet[2614]: W0317 18:02:35.835299 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.835515 kubelet[2614]: E0317 18:02:35.835500 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.835596 kubelet[2614]: W0317 18:02:35.835529 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.835596 kubelet[2614]: E0317 18:02:35.835502 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.835683 kubelet[2614]: E0317 18:02:35.835595 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.835884 kubelet[2614]: E0317 18:02:35.835872 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.836024 kubelet[2614]: W0317 18:02:35.835943 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.836126 kubelet[2614]: E0317 18:02:35.836092 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.836272 kubelet[2614]: E0317 18:02:35.836261 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.836413 kubelet[2614]: W0317 18:02:35.836329 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.836413 kubelet[2614]: E0317 18:02:35.836361 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.836690 kubelet[2614]: E0317 18:02:35.836669 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.836690 kubelet[2614]: W0317 18:02:35.836686 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.836795 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.836928 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.837768 kubelet[2614]: W0317 18:02:35.836938 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.837045 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.837177 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.837768 kubelet[2614]: W0317 18:02:35.837187 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.837236 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.837446 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:35.837768 kubelet[2614]: W0317 18:02:35.837455 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:35.837768 kubelet[2614]: E0317 18:02:35.837467 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:35.991612 containerd[1738]: time="2025-03-17T18:02:35.991470313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4k5bl,Uid:714c58ad-c4b0-401c-9d1f-4e806d24476d,Namespace:kube-system,Attempt:0,}" Mar 17 18:02:35.997241 containerd[1738]: time="2025-03-17T18:02:35.997189990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9hhzz,Uid:08c7d3c4-1287-4d0c-b98b-c036d1e808d2,Namespace:calico-system,Attempt:0,}" Mar 17 18:02:36.667307 kubelet[2614]: E0317 18:02:36.667271 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:36.726563 containerd[1738]: time="2025-03-17T18:02:36.726513620Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 18:02:36.731637 containerd[1738]: time="2025-03-17T18:02:36.731600288Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Mar 17 18:02:36.733518 containerd[1738]: time="2025-03-17T18:02:36.733485113Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 18:02:36.738962 containerd[1738]: time="2025-03-17T18:02:36.738933186Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 18:02:36.740925 containerd[1738]: time="2025-03-17T18:02:36.740891912Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 18:02:36.745028 containerd[1738]: time="2025-03-17T18:02:36.744981967Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 18:02:36.746300 containerd[1738]: time="2025-03-17T18:02:36.745747777Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 754.147862ms" Mar 17 18:02:36.750235 containerd[1738]: time="2025-03-17T18:02:36.750184436Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 752.841144ms" Mar 17 18:02:36.812378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3022938054.mount: Deactivated successfully. Mar 17 18:02:37.567895 containerd[1738]: time="2025-03-17T18:02:37.567675243Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:02:37.567895 containerd[1738]: time="2025-03-17T18:02:37.567744244Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:02:37.567895 containerd[1738]: time="2025-03-17T18:02:37.567769045Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:02:37.568631 containerd[1738]: time="2025-03-17T18:02:37.567960947Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:02:37.569858 containerd[1738]: time="2025-03-17T18:02:37.569115463Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:02:37.569858 containerd[1738]: time="2025-03-17T18:02:37.569182563Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:02:37.570289 containerd[1738]: time="2025-03-17T18:02:37.570234777Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:02:37.570524 containerd[1738]: time="2025-03-17T18:02:37.570470081Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:02:37.668344 kubelet[2614]: E0317 18:02:37.668305 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:37.727229 kubelet[2614]: E0317 18:02:37.727149 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:37.996427 systemd[1]: Started cri-containerd-3643c772132ff3ab3b72bfe9991aff9431643284a1fb15ce0985f16c7657dbff.scope - libcontainer container 3643c772132ff3ab3b72bfe9991aff9431643284a1fb15ce0985f16c7657dbff. Mar 17 18:02:37.998550 systemd[1]: Started cri-containerd-3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb.scope - libcontainer container 3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb. Mar 17 18:02:38.030788 containerd[1738]: time="2025-03-17T18:02:38.030657320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4k5bl,Uid:714c58ad-c4b0-401c-9d1f-4e806d24476d,Namespace:kube-system,Attempt:0,} returns sandbox id \"3643c772132ff3ab3b72bfe9991aff9431643284a1fb15ce0985f16c7657dbff\"" Mar 17 18:02:38.034682 containerd[1738]: time="2025-03-17T18:02:38.034422071Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\"" Mar 17 18:02:38.035409 containerd[1738]: time="2025-03-17T18:02:38.035351283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9hhzz,Uid:08c7d3c4-1287-4d0c-b98b-c036d1e808d2,Namespace:calico-system,Attempt:0,} returns sandbox id \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\"" Mar 17 18:02:38.670235 kubelet[2614]: E0317 18:02:38.669079 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:39.132445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1194507023.mount: Deactivated successfully. 
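
Note: both pod sandboxes are now running under their cri-containerd scopes and the pull of registry.k8s.io/kube-proxy:v1.32.3 begins. If crictl is installed on the node (assumed; it is not referenced anywhere in this log), the sandboxes and images can be inspected directly:

    # Assumes crictl is present and pointed at containerd's CRI socket.
    sudo crictl pods      # should list the kube-proxy-4k5bl and calico-node-9hhzz sandboxes
    sudo crictl images    # pause and kube-proxy images appear as the pulls complete
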
Mar 17 18:02:39.648853 containerd[1738]: time="2025-03-17T18:02:39.648808210Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:39.652341 containerd[1738]: time="2025-03-17T18:02:39.652295557Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=30918193" Mar 17 18:02:39.655123 containerd[1738]: time="2025-03-17T18:02:39.655073694Z" level=info msg="ImageCreate event name:\"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:39.658328 containerd[1738]: time="2025-03-17T18:02:39.658272436Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:39.659115 containerd[1738]: time="2025-03-17T18:02:39.658940945Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"30917204\" in 1.624469074s" Mar 17 18:02:39.659115 containerd[1738]: time="2025-03-17T18:02:39.658978046Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\"" Mar 17 18:02:39.660102 containerd[1738]: time="2025-03-17T18:02:39.660054360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 18:02:39.661357 containerd[1738]: time="2025-03-17T18:02:39.661327777Z" level=info msg="CreateContainer within sandbox \"3643c772132ff3ab3b72bfe9991aff9431643284a1fb15ce0985f16c7657dbff\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 18:02:39.669810 kubelet[2614]: E0317 18:02:39.669778 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:39.698822 containerd[1738]: time="2025-03-17T18:02:39.698786077Z" level=info msg="CreateContainer within sandbox \"3643c772132ff3ab3b72bfe9991aff9431643284a1fb15ce0985f16c7657dbff\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"42212621af23afa43d757549a251197be07abd12e1ec5a6626ae411d2058f7f1\"" Mar 17 18:02:39.699470 containerd[1738]: time="2025-03-17T18:02:39.699411885Z" level=info msg="StartContainer for \"42212621af23afa43d757549a251197be07abd12e1ec5a6626ae411d2058f7f1\"" Mar 17 18:02:39.727788 kubelet[2614]: E0317 18:02:39.727428 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:39.732347 systemd[1]: Started cri-containerd-42212621af23afa43d757549a251197be07abd12e1ec5a6626ae411d2058f7f1.scope - libcontainer container 42212621af23afa43d757549a251197be07abd12e1ec5a6626ae411d2058f7f1. 
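
Note: the kube-proxy container is created inside its sandbox and its runc scope started (the StartContainer call completes just below), and the pull of ghcr.io/flatcar/calico/pod2daemon-flexvol begins. The repeated FlexVolume probe errors earlier (nodeagent~uds) are expected at this stage: the kubelet recreated /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, but the uds driver binary is normally installed by Calico's flexvol init container via the flexvol-driver-host hostPath mounted into calico-node-9hhzz (standard Calico behaviour; not spelled out in this log). A quick check on the node:

    # The flexvol-driver-host hostPath mounted into calico-node-9hhzz (see the volume list at 18:02:35).
    ls -l /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ 2>/dev/null \
      || echo "uds driver not installed yet"
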
Mar 17 18:02:39.762809 containerd[1738]: time="2025-03-17T18:02:39.762696930Z" level=info msg="StartContainer for \"42212621af23afa43d757549a251197be07abd12e1ec5a6626ae411d2058f7f1\" returns successfully" Mar 17 18:02:40.670412 kubelet[2614]: E0317 18:02:40.670362 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:40.829914 kubelet[2614]: E0317 18:02:40.829873 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.829914 kubelet[2614]: W0317 18:02:40.829904 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.830581 kubelet[2614]: E0317 18:02:40.829931 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.830581 kubelet[2614]: E0317 18:02:40.830251 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.830581 kubelet[2614]: W0317 18:02:40.830268 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.830581 kubelet[2614]: E0317 18:02:40.830289 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.830581 kubelet[2614]: E0317 18:02:40.830544 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.830581 kubelet[2614]: W0317 18:02:40.830557 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.830581 kubelet[2614]: E0317 18:02:40.830574 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.830874 kubelet[2614]: E0317 18:02:40.830809 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.830874 kubelet[2614]: W0317 18:02:40.830821 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.830874 kubelet[2614]: E0317 18:02:40.830833 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:40.831048 kubelet[2614]: E0317 18:02:40.831005 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.831048 kubelet[2614]: W0317 18:02:40.831020 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.831048 kubelet[2614]: E0317 18:02:40.831033 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.831377 kubelet[2614]: E0317 18:02:40.831226 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.831377 kubelet[2614]: W0317 18:02:40.831237 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.831377 kubelet[2614]: E0317 18:02:40.831250 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.831598 kubelet[2614]: E0317 18:02:40.831443 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.831598 kubelet[2614]: W0317 18:02:40.831454 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.831598 kubelet[2614]: E0317 18:02:40.831467 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.831805 kubelet[2614]: E0317 18:02:40.831655 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.831805 kubelet[2614]: W0317 18:02:40.831665 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.831805 kubelet[2614]: E0317 18:02:40.831677 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.832039 kubelet[2614]: E0317 18:02:40.831865 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.832039 kubelet[2614]: W0317 18:02:40.831875 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.832039 kubelet[2614]: E0317 18:02:40.831887 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:40.832279 kubelet[2614]: E0317 18:02:40.832060 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.832279 kubelet[2614]: W0317 18:02:40.832072 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.832279 kubelet[2614]: E0317 18:02:40.832084 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.832279 kubelet[2614]: E0317 18:02:40.832274 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.832553 kubelet[2614]: W0317 18:02:40.832285 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.832553 kubelet[2614]: E0317 18:02:40.832299 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.832553 kubelet[2614]: E0317 18:02:40.832516 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.832553 kubelet[2614]: W0317 18:02:40.832526 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.832553 kubelet[2614]: E0317 18:02:40.832538 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.832885 kubelet[2614]: E0317 18:02:40.832732 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.832885 kubelet[2614]: W0317 18:02:40.832742 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.832885 kubelet[2614]: E0317 18:02:40.832753 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.833105 kubelet[2614]: E0317 18:02:40.832928 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.833105 kubelet[2614]: W0317 18:02:40.832938 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.833105 kubelet[2614]: E0317 18:02:40.832949 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:40.833333 kubelet[2614]: E0317 18:02:40.833122 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.833333 kubelet[2614]: W0317 18:02:40.833133 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.833333 kubelet[2614]: E0317 18:02:40.833145 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.833551 kubelet[2614]: E0317 18:02:40.833336 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.833551 kubelet[2614]: W0317 18:02:40.833346 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.833551 kubelet[2614]: E0317 18:02:40.833358 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.833551 kubelet[2614]: E0317 18:02:40.833545 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.833991 kubelet[2614]: W0317 18:02:40.833555 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.833991 kubelet[2614]: E0317 18:02:40.833566 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.833991 kubelet[2614]: E0317 18:02:40.833767 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.833991 kubelet[2614]: W0317 18:02:40.833778 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.833991 kubelet[2614]: E0317 18:02:40.833793 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.833991 kubelet[2614]: E0317 18:02:40.833978 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.833991 kubelet[2614]: W0317 18:02:40.833989 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.834345 kubelet[2614]: E0317 18:02:40.834001 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:40.834345 kubelet[2614]: E0317 18:02:40.834178 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.834345 kubelet[2614]: W0317 18:02:40.834188 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.834345 kubelet[2614]: E0317 18:02:40.834199 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.835520 kubelet[2614]: E0317 18:02:40.835500 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.835520 kubelet[2614]: W0317 18:02:40.835515 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.835529 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.835740 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.836899 kubelet[2614]: W0317 18:02:40.835749 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.835764 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.835957 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.836899 kubelet[2614]: W0317 18:02:40.835966 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.835979 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.836144 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.836899 kubelet[2614]: W0317 18:02:40.836153 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.836899 kubelet[2614]: E0317 18:02:40.836167 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.836350 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.837500 kubelet[2614]: W0317 18:02:40.836359 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.836371 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.836561 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.837500 kubelet[2614]: W0317 18:02:40.836571 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.836586 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.836966 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.837500 kubelet[2614]: W0317 18:02:40.836978 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.836999 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.837500 kubelet[2614]: E0317 18:02:40.837190 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.837917 kubelet[2614]: W0317 18:02:40.837201 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.837917 kubelet[2614]: E0317 18:02:40.837233 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.837917 kubelet[2614]: E0317 18:02:40.837423 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.837917 kubelet[2614]: W0317 18:02:40.837434 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.837917 kubelet[2614]: E0317 18:02:40.837451 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:02:40.837917 kubelet[2614]: E0317 18:02:40.837685 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.837917 kubelet[2614]: W0317 18:02:40.837696 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.837917 kubelet[2614]: E0317 18:02:40.837708 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.838265 kubelet[2614]: E0317 18:02:40.838051 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.838265 kubelet[2614]: W0317 18:02:40.838061 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.838265 kubelet[2614]: E0317 18:02:40.838073 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:40.838388 kubelet[2614]: E0317 18:02:40.838299 2614 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:02:40.838388 kubelet[2614]: W0317 18:02:40.838310 2614 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:02:40.838388 kubelet[2614]: E0317 18:02:40.838323 2614 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:02:41.079644 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount788416786.mount: Deactivated successfully. 
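The wall of driver-call.go errors above is one failure repeated for every FlexVolume probe: the kubelet invokes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with an init call, the executable does not exist yet (presumably it is put in place by the Calico flexvol-driver container that starts shortly after this), so the call yields empty output, and parsing "" as JSON fails. A small illustrative Python sketch of that sequence, using the path and messages from the log rather than kubelet's real Go code:

    import json
    import shutil
    import subprocess

    DRIVER = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"  # path from the log

    def probe_flexvolume_driver(executable: str = DRIVER) -> dict:
        """Run '<driver> init' and parse its JSON reply, mirroring the logged failure."""
        if shutil.which(executable) is None:
            # "driver call failed ... executable file not found in $PATH, output: ''"
            output = ""
        else:
            output = subprocess.run([executable, "init"], capture_output=True, text=True).stdout
        try:
            return json.loads(output)  # parsing "" fails, like Go's "unexpected end of JSON input"
        except json.JSONDecodeError as err:
            raise RuntimeError(f"Failed to unmarshal output for command: init: {err}") from err

    try:
        probe_flexvolume_driver()
    except RuntimeError as err:
        print(err)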
Mar 17 18:02:41.212522 containerd[1738]: time="2025-03-17T18:02:41.212473573Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:41.214594 containerd[1738]: time="2025-03-17T18:02:41.214543100Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=6857253" Mar 17 18:02:41.217663 containerd[1738]: time="2025-03-17T18:02:41.217614341Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:41.221682 containerd[1738]: time="2025-03-17T18:02:41.221634895Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:41.222352 containerd[1738]: time="2025-03-17T18:02:41.222193103Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.562103541s" Mar 17 18:02:41.222352 containerd[1738]: time="2025-03-17T18:02:41.222247103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 17 18:02:41.224371 containerd[1738]: time="2025-03-17T18:02:41.224340131Z" level=info msg="CreateContainer within sandbox \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 18:02:41.260550 containerd[1738]: time="2025-03-17T18:02:41.260515014Z" level=info msg="CreateContainer within sandbox \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6\"" Mar 17 18:02:41.261022 containerd[1738]: time="2025-03-17T18:02:41.260993420Z" level=info msg="StartContainer for \"7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6\"" Mar 17 18:02:41.295366 systemd[1]: Started cri-containerd-7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6.scope - libcontainer container 7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6. Mar 17 18:02:41.322528 containerd[1738]: time="2025-03-17T18:02:41.322433340Z" level=info msg="StartContainer for \"7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6\" returns successfully" Mar 17 18:02:41.330120 systemd[1]: cri-containerd-7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6.scope: Deactivated successfully. 
Mar 17 18:02:41.671255 kubelet[2614]: E0317 18:02:41.670977 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:41.727517 kubelet[2614]: E0317 18:02:41.727432 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:41.774019 kubelet[2614]: I0317 18:02:41.773952 2614 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4k5bl" podStartSLOduration=6.147766268 podStartE2EDuration="7.773932764s" podCreationTimestamp="2025-03-17 18:02:34 +0000 UTC" firstStartedPulling="2025-03-17 18:02:38.033678861 +0000 UTC m=+4.082911736" lastFinishedPulling="2025-03-17 18:02:39.659845357 +0000 UTC m=+5.709078232" observedRunningTime="2025-03-17 18:02:40.767345234 +0000 UTC m=+6.816578109" watchObservedRunningTime="2025-03-17 18:02:41.773932764 +0000 UTC m=+7.823165739" Mar 17 18:02:42.045086 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6-rootfs.mount: Deactivated successfully. Mar 17 18:02:42.269354 containerd[1738]: time="2025-03-17T18:02:42.269275839Z" level=info msg="shim disconnected" id=7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6 namespace=k8s.io Mar 17 18:02:42.269354 containerd[1738]: time="2025-03-17T18:02:42.269352941Z" level=warning msg="cleaning up after shim disconnected" id=7a1de2fd1eae7af80de8b8b59b4020ae786495a9839b6c307841103956ade4c6 namespace=k8s.io Mar 17 18:02:42.269354 containerd[1738]: time="2025-03-17T18:02:42.269366541Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 18:02:42.672040 kubelet[2614]: E0317 18:02:42.671978 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:42.763276 containerd[1738]: time="2025-03-17T18:02:42.763194686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 18:02:43.672798 kubelet[2614]: E0317 18:02:43.672701 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:43.727583 kubelet[2614]: E0317 18:02:43.727522 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:44.673306 kubelet[2614]: E0317 18:02:44.673233 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:45.673667 kubelet[2614]: E0317 18:02:45.673612 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:45.727123 kubelet[2614]: E0317 18:02:45.727045 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 
18:02:46.631959 containerd[1738]: time="2025-03-17T18:02:46.631912846Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:46.633922 containerd[1738]: time="2025-03-17T18:02:46.633767782Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 17 18:02:46.637384 containerd[1738]: time="2025-03-17T18:02:46.637333552Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:46.640555 containerd[1738]: time="2025-03-17T18:02:46.640506314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:46.641271 containerd[1738]: time="2025-03-17T18:02:46.641098125Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 3.877820538s" Mar 17 18:02:46.641271 containerd[1738]: time="2025-03-17T18:02:46.641132526Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 17 18:02:46.643351 containerd[1738]: time="2025-03-17T18:02:46.643316268Z" level=info msg="CreateContainer within sandbox \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 18:02:46.673771 kubelet[2614]: E0317 18:02:46.673745 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:46.675328 containerd[1738]: time="2025-03-17T18:02:46.675298093Z" level=info msg="CreateContainer within sandbox \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d\"" Mar 17 18:02:46.675783 containerd[1738]: time="2025-03-17T18:02:46.675728101Z" level=info msg="StartContainer for \"59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d\"" Mar 17 18:02:46.711355 systemd[1]: Started cri-containerd-59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d.scope - libcontainer container 59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d. 
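Just before the Calico CNI pull above, the pod_startup_latency_tracker entry for kube-proxy-4k5bl packs several durations into one line; they are mutually consistent if podStartE2EDuration is observedRunningTime minus podCreationTimestamp and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick check of the logged numbers:

    created               = 34.0            # podCreationTimestamp  18:02:34
    first_started_pulling = 38.033678861    # firstStartedPulling   18:02:38.033678861
    last_finished_pulling = 39.659845357    # lastFinishedPulling   18:02:39.659845357
    observed_running      = 41.773932764    # observedRunningTime   18:02:41.773932764

    pull = last_finished_pulling - first_started_pulling
    e2e  = observed_running - created
    slo  = e2e - pull
    print(f"pull={pull:.9f}s  e2e={e2e:.9f}s  slo={slo:.9f}s")
    # pull=1.626166496s  e2e=7.773932764s  slo=6.147766268s, matching the logged values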
Mar 17 18:02:46.740919 containerd[1738]: time="2025-03-17T18:02:46.740867974Z" level=info msg="StartContainer for \"59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d\" returns successfully" Mar 17 18:02:47.674715 kubelet[2614]: E0317 18:02:47.674658 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:47.726972 kubelet[2614]: E0317 18:02:47.726900 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:48.131644 containerd[1738]: time="2025-03-17T18:02:48.131587238Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:02:48.133608 systemd[1]: cri-containerd-59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d.scope: Deactivated successfully. Mar 17 18:02:48.134454 systemd[1]: cri-containerd-59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d.scope: Consumed 433ms CPU time, 172.1M memory peak, 154M written to disk. Mar 17 18:02:48.155843 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d-rootfs.mount: Deactivated successfully. Mar 17 18:02:48.204605 kubelet[2614]: I0317 18:02:48.204323 2614 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 17 18:02:48.675103 kubelet[2614]: E0317 18:02:48.675030 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:49.675819 kubelet[2614]: E0317 18:02:49.675748 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:49.733347 systemd[1]: Created slice kubepods-besteffort-pod82ad5d43_e4e0_4c4f_8306_eb61af926aaf.slice - libcontainer container kubepods-besteffort-pod82ad5d43_e4e0_4c4f_8306_eb61af926aaf.slice. 
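The "failed to reload cni configuration" entry above is containerd's CNI watcher reacting to a write under /etc/cni/net.d: the file that changed (calico-kubeconfig) is not itself a network config, and no network config has been written yet, so the reload still finds nothing. A small illustrative check for the same condition, assuming the usual *.conf / *.conflist suffixes rather than containerd's actual loader:

    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/cni/net.d")            # directory named in the log
    NETWORK_CONF_SUFFIXES = (".conf", ".conflist")   # conventional CNI network-config suffixes

    def cni_network_configs(conf_dir: Path = CNI_CONF_DIR) -> list[Path]:
        """List CNI network configs; an empty result corresponds to 'no network config found'."""
        if not conf_dir.is_dir():
            return []
        return sorted(p for p in conf_dir.iterdir() if p.suffix in NETWORK_CONF_SUFFIXES)

    if not cni_network_configs():
        print("cni config load failed: no network config found in /etc/cni/net.d")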
Mar 17 18:02:49.736129 containerd[1738]: time="2025-03-17T18:02:49.736082351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:0,}" Mar 17 18:02:49.826852 containerd[1738]: time="2025-03-17T18:02:49.826784307Z" level=info msg="shim disconnected" id=59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d namespace=k8s.io Mar 17 18:02:49.826852 containerd[1738]: time="2025-03-17T18:02:49.826847908Z" level=warning msg="cleaning up after shim disconnected" id=59fca1b9185896c4ee8fb1f17b3ee31de98afdbe6f8259e683c32abd634ee13d namespace=k8s.io Mar 17 18:02:49.826852 containerd[1738]: time="2025-03-17T18:02:49.826858608Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 18:02:49.896645 containerd[1738]: time="2025-03-17T18:02:49.896575373Z" level=error msg="Failed to destroy network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:49.899106 containerd[1738]: time="2025-03-17T18:02:49.896913078Z" level=error msg="encountered an error cleaning up failed sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:49.899106 containerd[1738]: time="2025-03-17T18:02:49.896998179Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:49.899342 kubelet[2614]: E0317 18:02:49.899283 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:49.899419 kubelet[2614]: E0317 18:02:49.899363 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:49.899419 kubelet[2614]: E0317 18:02:49.899394 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:49.899871 kubelet[2614]: E0317 18:02:49.899458 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:49.899578 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22-shm.mount: Deactivated successfully. Mar 17 18:02:49.929907 systemd[1]: Created slice kubepods-besteffort-pod7c8e6616_65fb_4a59_beb7_79a79cf32e0c.slice - libcontainer container kubepods-besteffort-pod7c8e6616_65fb_4a59_beb7_79a79cf32e0c.slice. Mar 17 18:02:49.989455 kubelet[2614]: I0317 18:02:49.989413 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbfp4\" (UniqueName: \"kubernetes.io/projected/7c8e6616-65fb-4a59-beb7-79a79cf32e0c-kube-api-access-hbfp4\") pod \"nginx-deployment-7fcdb87857-lzds7\" (UID: \"7c8e6616-65fb-4a59-beb7-79a79cf32e0c\") " pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:50.233587 containerd[1738]: time="2025-03-17T18:02:50.233447437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:0,}" Mar 17 18:02:50.310475 containerd[1738]: time="2025-03-17T18:02:50.310422290Z" level=error msg="Failed to destroy network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.310761 containerd[1738]: time="2025-03-17T18:02:50.310727596Z" level=error msg="encountered an error cleaning up failed sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.310841 containerd[1738]: time="2025-03-17T18:02:50.310795898Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:0,} failed, error" error="failed to setup network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.311068 kubelet[2614]: E0317 18:02:50.311031 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.311149 kubelet[2614]: E0317 18:02:50.311099 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:50.311149 kubelet[2614]: E0317 18:02:50.311128 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:50.311250 kubelet[2614]: E0317 18:02:50.311188 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:50.676687 kubelet[2614]: E0317 18:02:50.676588 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:50.778075 kubelet[2614]: I0317 18:02:50.778038 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0" Mar 17 18:02:50.779690 containerd[1738]: time="2025-03-17T18:02:50.779222929Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:50.779690 containerd[1738]: time="2025-03-17T18:02:50.779504935Z" level=info msg="Ensure that sandbox 616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0 in task-service has been cleanup successfully" Mar 17 18:02:50.780227 kubelet[2614]: I0317 18:02:50.779575 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22" Mar 17 18:02:50.780840 containerd[1738]: time="2025-03-17T18:02:50.780293251Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:50.780840 containerd[1738]: time="2025-03-17T18:02:50.780371453Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:50.780840 containerd[1738]: time="2025-03-17T18:02:50.780398453Z" level=info msg="StopPodSandbox for 
\"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:50.780840 containerd[1738]: time="2025-03-17T18:02:50.780539556Z" level=info msg="Ensure that sandbox c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22 in task-service has been cleanup successfully" Mar 17 18:02:50.781618 containerd[1738]: time="2025-03-17T18:02:50.780790961Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:50.781618 containerd[1738]: time="2025-03-17T18:02:50.781151969Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:50.781618 containerd[1738]: time="2025-03-17T18:02:50.781011166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:1,}" Mar 17 18:02:50.781906 containerd[1738]: time="2025-03-17T18:02:50.781844783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:1,}" Mar 17 18:02:50.784607 containerd[1738]: time="2025-03-17T18:02:50.784581340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 18:02:50.845527 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0-shm.mount: Deactivated successfully. Mar 17 18:02:50.845655 systemd[1]: run-netns-cni\x2dab498a73\x2d606d\x2dffa7\x2d9e1e\x2d77758cdfd678.mount: Deactivated successfully. Mar 17 18:02:50.918879 containerd[1738]: time="2025-03-17T18:02:50.918826029Z" level=error msg="Failed to destroy network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.921360 containerd[1738]: time="2025-03-17T18:02:50.921249079Z" level=error msg="Failed to destroy network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.921805 containerd[1738]: time="2025-03-17T18:02:50.921683588Z" level=error msg="encountered an error cleaning up failed sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.921805 containerd[1738]: time="2025-03-17T18:02:50.921755690Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.921916 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd-shm.mount: Deactivated successfully. Mar 17 18:02:50.922683 kubelet[2614]: E0317 18:02:50.922258 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.922683 kubelet[2614]: E0317 18:02:50.922323 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:50.922683 kubelet[2614]: E0317 18:02:50.922350 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:50.922867 kubelet[2614]: E0317 18:02:50.922397 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:50.924511 containerd[1738]: time="2025-03-17T18:02:50.924476946Z" level=error msg="encountered an error cleaning up failed sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.924681 containerd[1738]: time="2025-03-17T18:02:50.924651650Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.925144 kubelet[2614]: E0317 18:02:50.924958 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:50.925144 kubelet[2614]: E0317 18:02:50.925008 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:50.925144 kubelet[2614]: E0317 18:02:50.925040 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:50.926366 kubelet[2614]: E0317 18:02:50.925085 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:50.926323 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4-shm.mount: Deactivated successfully. 
Mar 17 18:02:51.676921 kubelet[2614]: E0317 18:02:51.676868 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:51.786904 kubelet[2614]: I0317 18:02:51.786868 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4" Mar 17 18:02:51.788149 containerd[1738]: time="2025-03-17T18:02:51.787762980Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:51.788149 containerd[1738]: time="2025-03-17T18:02:51.788040186Z" level=info msg="Ensure that sandbox 8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4 in task-service has been cleanup successfully" Mar 17 18:02:51.791076 containerd[1738]: time="2025-03-17T18:02:51.788386693Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:51.791076 containerd[1738]: time="2025-03-17T18:02:51.788430794Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:51.791076 containerd[1738]: time="2025-03-17T18:02:51.790563438Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:51.791076 containerd[1738]: time="2025-03-17T18:02:51.790670441Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:51.791076 containerd[1738]: time="2025-03-17T18:02:51.790690141Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:51.793330 kubelet[2614]: I0317 18:02:51.791476 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd" Mar 17 18:02:51.792833 systemd[1]: run-netns-cni\x2d61f35584\x2d74fc\x2d8248\x2d7f98\x2da0a3795cfcad.mount: Deactivated successfully. 
Mar 17 18:02:51.793539 containerd[1738]: time="2025-03-17T18:02:51.791853165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:2,}" Mar 17 18:02:51.793539 containerd[1738]: time="2025-03-17T18:02:51.792112671Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:51.793539 containerd[1738]: time="2025-03-17T18:02:51.793171593Z" level=info msg="Ensure that sandbox 52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd in task-service has been cleanup successfully" Mar 17 18:02:51.796234 containerd[1738]: time="2025-03-17T18:02:51.794252615Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:51.796234 containerd[1738]: time="2025-03-17T18:02:51.794275216Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:51.796234 containerd[1738]: time="2025-03-17T18:02:51.794670724Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:51.796234 containerd[1738]: time="2025-03-17T18:02:51.794753025Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:51.796234 containerd[1738]: time="2025-03-17T18:02:51.794766526Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:51.796234 containerd[1738]: time="2025-03-17T18:02:51.795287937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:2,}" Mar 17 18:02:51.796826 systemd[1]: run-netns-cni\x2d4631e674\x2df2b5\x2dd6b8\x2d5a28\x2dfc0ba2b3a352.mount: Deactivated successfully. 
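The StopPodSandbox / TearDown / RunPodSandbox churn above is the kubelet's recovery path: each failed sandbox is torn down and the pod is retried with the Attempt counter incremented (0, 1, 2 so far for both pods). A compressed, purely illustrative sketch of one such pass; the function and parameter names are invented, and the real kubelet spreads this over successive pod syncs with backoff rather than running a tight loop:

    def sync_pod_sandbox(pod: str, attempt: int, run, stop) -> int:
        """One sync pass: try to create the sandbox; on failure tear it down and bump Attempt."""
        try:
            run(pod, attempt)        # RunPodSandbox ... Attempt:<n>
            return attempt           # success: sandbox is up
        except RuntimeError:
            stop(pod, attempt)       # StopPodSandbox / "TearDown network ... successfully"
            return attempt + 1       # the next sync retries with Attempt:<n+1>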
Mar 17 18:02:51.924234 containerd[1738]: time="2025-03-17T18:02:51.924170014Z" level=error msg="Failed to destroy network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.926344 containerd[1738]: time="2025-03-17T18:02:51.926302058Z" level=error msg="Failed to destroy network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.926639 containerd[1738]: time="2025-03-17T18:02:51.926603965Z" level=error msg="encountered an error cleaning up failed sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.926722 containerd[1738]: time="2025-03-17T18:02:51.926678566Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.927252 containerd[1738]: time="2025-03-17T18:02:51.926975072Z" level=error msg="encountered an error cleaning up failed sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.927252 containerd[1738]: time="2025-03-17T18:02:51.927032273Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.927411 kubelet[2614]: E0317 18:02:51.927262 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.927411 kubelet[2614]: E0317 18:02:51.927325 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:51.927411 kubelet[2614]: E0317 18:02:51.927352 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:51.927565 kubelet[2614]: E0317 18:02:51.927406 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:51.930499 kubelet[2614]: E0317 18:02:51.927688 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:51.930499 kubelet[2614]: E0317 18:02:51.927744 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:51.930499 kubelet[2614]: E0317 18:02:51.927776 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:51.928442 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba-shm.mount: Deactivated successfully. 
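
Every sandbox add and delete above fails on the same stat of /var/lib/calico/nodename, and the error message itself names the check to run: confirm that the calico/node container is running and has /var/lib/calico mounted. A hedged diagnostic sketch of that check follows; the calico-system namespace and the k8s-app=calico-node label assume an operator-managed Calico install and are not taken from this log.

    import os
    import subprocess

    NODENAME = "/var/lib/calico/nodename"  # file the CNI plugin stats before add/delete

    def calico_node_ready() -> bool:
        # The nodename file is written by the calico/node container once it starts
        # with /var/lib/calico mounted, which is exactly what the error asks to verify.
        if os.path.exists(NODENAME):
            with open(NODENAME) as f:
                print("calico nodename:", f.read().strip())
            return True
        print(NODENAME, "is missing; calico/node has not initialised on this host yet")
        return False

    if not calico_node_ready():
        # Namespace and label are assumptions for a typical operator-managed Calico
        # install; adjust if calico-node runs in kube-system instead.
        subprocess.run(["kubectl", "-n", "calico-system", "get", "pods",
                        "-l", "k8s-app=calico-node", "-o", "wide"], check=False)
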
Mar 17 18:02:51.930892 kubelet[2614]: E0317 18:02:51.927816 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:51.932807 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c-shm.mount: Deactivated successfully. Mar 17 18:02:52.678111 kubelet[2614]: E0317 18:02:52.678004 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:52.794449 kubelet[2614]: I0317 18:02:52.794419 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c" Mar 17 18:02:52.795556 containerd[1738]: time="2025-03-17T18:02:52.795199109Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:52.795556 containerd[1738]: time="2025-03-17T18:02:52.795456914Z" level=info msg="Ensure that sandbox 27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c in task-service has been cleanup successfully" Mar 17 18:02:52.796015 containerd[1738]: time="2025-03-17T18:02:52.795770621Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:52.796015 containerd[1738]: time="2025-03-17T18:02:52.795793321Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:52.796435 containerd[1738]: time="2025-03-17T18:02:52.796385033Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:52.796526 containerd[1738]: time="2025-03-17T18:02:52.796492636Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:52.796526 containerd[1738]: time="2025-03-17T18:02:52.796509336Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:52.797014 containerd[1738]: time="2025-03-17T18:02:52.796852943Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:52.797014 containerd[1738]: time="2025-03-17T18:02:52.796942445Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:52.797014 containerd[1738]: time="2025-03-17T18:02:52.796957645Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:52.797189 kubelet[2614]: I0317 18:02:52.796999 2614 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba" Mar 17 18:02:52.797915 containerd[1738]: time="2025-03-17T18:02:52.797620659Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:52.797915 containerd[1738]: time="2025-03-17T18:02:52.797657060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:3,}" Mar 17 18:02:52.797915 containerd[1738]: time="2025-03-17T18:02:52.797790463Z" level=info msg="Ensure that sandbox f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba in task-service has been cleanup successfully" Mar 17 18:02:52.798097 containerd[1738]: time="2025-03-17T18:02:52.798070168Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:52.798145 containerd[1738]: time="2025-03-17T18:02:52.798093569Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:52.798377 containerd[1738]: time="2025-03-17T18:02:52.798349974Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:52.798459 containerd[1738]: time="2025-03-17T18:02:52.798444076Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:52.798503 containerd[1738]: time="2025-03-17T18:02:52.798457776Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:52.798876 containerd[1738]: time="2025-03-17T18:02:52.798784983Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:52.798876 containerd[1738]: time="2025-03-17T18:02:52.798868085Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:52.799143 containerd[1738]: time="2025-03-17T18:02:52.798881685Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:52.799430 containerd[1738]: time="2025-03-17T18:02:52.799403796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:3,}" Mar 17 18:02:52.844330 systemd[1]: run-netns-cni\x2d1b5245e7\x2de119\x2d5833\x2dc98d\x2da1b60f0b8abd.mount: Deactivated successfully. Mar 17 18:02:52.844484 systemd[1]: run-netns-cni\x2d2b8c6381\x2d19ef\x2dbb8c\x2d9eb0\x2dad594ed48491.mount: Deactivated successfully. 
Mar 17 18:02:52.939070 containerd[1738]: time="2025-03-17T18:02:52.938705790Z" level=error msg="Failed to destroy network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.942753 containerd[1738]: time="2025-03-17T18:02:52.942683873Z" level=error msg="encountered an error cleaning up failed sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.942919 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75-shm.mount: Deactivated successfully. Mar 17 18:02:52.943357 containerd[1738]: time="2025-03-17T18:02:52.943298585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.945031 kubelet[2614]: E0317 18:02:52.944395 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.945031 kubelet[2614]: E0317 18:02:52.944462 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:52.945031 kubelet[2614]: E0317 18:02:52.944492 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:52.945250 kubelet[2614]: E0317 18:02:52.944540 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:52.965612 containerd[1738]: time="2025-03-17T18:02:52.965555248Z" level=error msg="Failed to destroy network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.965937 containerd[1738]: time="2025-03-17T18:02:52.965899255Z" level=error msg="encountered an error cleaning up failed sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.966011 containerd[1738]: time="2025-03-17T18:02:52.965967556Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.966237 kubelet[2614]: E0317 18:02:52.966190 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:52.966329 kubelet[2614]: E0317 18:02:52.966260 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:52.966329 kubelet[2614]: E0317 18:02:52.966294 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:52.966409 kubelet[2614]: E0317 18:02:52.966356 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:53.678648 kubelet[2614]: E0317 18:02:53.678605 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:53.802913 kubelet[2614]: I0317 18:02:53.802878 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75" Mar 17 18:02:53.804984 containerd[1738]: time="2025-03-17T18:02:53.804374373Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:53.804984 containerd[1738]: time="2025-03-17T18:02:53.804625379Z" level=info msg="Ensure that sandbox a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75 in task-service has been cleanup successfully" Mar 17 18:02:53.804984 containerd[1738]: time="2025-03-17T18:02:53.804802282Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:53.804984 containerd[1738]: time="2025-03-17T18:02:53.804819383Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:53.805663 containerd[1738]: time="2025-03-17T18:02:53.805201091Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:53.805663 containerd[1738]: time="2025-03-17T18:02:53.805305293Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:53.805663 containerd[1738]: time="2025-03-17T18:02:53.805320493Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:53.805780 containerd[1738]: time="2025-03-17T18:02:53.805701801Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:53.805820 containerd[1738]: time="2025-03-17T18:02:53.805782403Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:53.805820 containerd[1738]: time="2025-03-17T18:02:53.805796303Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:53.811490 containerd[1738]: time="2025-03-17T18:02:53.808325255Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:53.811490 containerd[1738]: time="2025-03-17T18:02:53.808424358Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:53.811490 containerd[1738]: time="2025-03-17T18:02:53.808439458Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:53.811490 containerd[1738]: time="2025-03-17T18:02:53.810501201Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:4,}" Mar 17 18:02:53.811709 kubelet[2614]: I0317 18:02:53.808993 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08" Mar 17 18:02:53.816122 containerd[1738]: time="2025-03-17T18:02:53.816097017Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:53.816472 containerd[1738]: time="2025-03-17T18:02:53.816448124Z" level=info msg="Ensure that sandbox dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08 in task-service has been cleanup successfully" Mar 17 18:02:53.816706 containerd[1738]: time="2025-03-17T18:02:53.816688029Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:02:53.816788 containerd[1738]: time="2025-03-17T18:02:53.816775131Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:53.818038 containerd[1738]: time="2025-03-17T18:02:53.818009057Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:53.818138 containerd[1738]: time="2025-03-17T18:02:53.818098358Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:53.818138 containerd[1738]: time="2025-03-17T18:02:53.818114059Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:53.819244 containerd[1738]: time="2025-03-17T18:02:53.819179081Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:53.819345 containerd[1738]: time="2025-03-17T18:02:53.819293083Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:53.819345 containerd[1738]: time="2025-03-17T18:02:53.819309184Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:53.819617 containerd[1738]: time="2025-03-17T18:02:53.819591590Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:53.819705 containerd[1738]: time="2025-03-17T18:02:53.819678291Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:53.819705 containerd[1738]: time="2025-03-17T18:02:53.819691792Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:53.820218 containerd[1738]: time="2025-03-17T18:02:53.820169502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:4,}" Mar 17 18:02:53.843852 systemd[1]: run-netns-cni\x2da41fd9a7\x2dc2bd\x2d25ff\x2dfb9e\x2d2ac0571f2865.mount: Deactivated successfully. Mar 17 18:02:53.844221 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08-shm.mount: Deactivated successfully. 
Mar 17 18:02:53.844367 systemd[1]: run-netns-cni\x2d4fc1e9a2\x2d12e1\x2db36b\x2d4225\x2d2b991a651345.mount: Deactivated successfully. Mar 17 18:02:53.966852 containerd[1738]: time="2025-03-17T18:02:53.966139834Z" level=error msg="Failed to destroy network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.969224 containerd[1738]: time="2025-03-17T18:02:53.967644265Z" level=error msg="encountered an error cleaning up failed sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.969100 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e-shm.mount: Deactivated successfully. Mar 17 18:02:53.969879 containerd[1738]: time="2025-03-17T18:02:53.969639307Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.970192 kubelet[2614]: E0317 18:02:53.970129 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.972305 kubelet[2614]: E0317 18:02:53.972126 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:53.972305 kubelet[2614]: E0317 18:02:53.972174 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:53.972549 kubelet[2614]: E0317 18:02:53.972459 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:53.973744 containerd[1738]: time="2025-03-17T18:02:53.973714391Z" level=error msg="Failed to destroy network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.974426 containerd[1738]: time="2025-03-17T18:02:53.974111800Z" level=error msg="encountered an error cleaning up failed sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.974426 containerd[1738]: time="2025-03-17T18:02:53.974192401Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.974580 kubelet[2614]: E0317 18:02:53.974529 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:53.974632 kubelet[2614]: E0317 18:02:53.974575 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:53.974632 kubelet[2614]: E0317 18:02:53.974598 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:53.974718 kubelet[2614]: E0317 18:02:53.974645 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:54.666641 kubelet[2614]: E0317 18:02:54.666568 2614 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:54.679189 kubelet[2614]: E0317 18:02:54.679062 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:54.813378 kubelet[2614]: I0317 18:02:54.813347 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749" Mar 17 18:02:54.814224 containerd[1738]: time="2025-03-17T18:02:54.814014948Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:02:54.814954 containerd[1738]: time="2025-03-17T18:02:54.814381655Z" level=info msg="Ensure that sandbox 65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749 in task-service has been cleanup successfully" Mar 17 18:02:54.814954 containerd[1738]: time="2025-03-17T18:02:54.814548859Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:02:54.814954 containerd[1738]: time="2025-03-17T18:02:54.814566659Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:02:54.815086 containerd[1738]: time="2025-03-17T18:02:54.815000268Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:54.815086 containerd[1738]: time="2025-03-17T18:02:54.815079470Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:54.815163 containerd[1738]: time="2025-03-17T18:02:54.815093470Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:54.815717 containerd[1738]: time="2025-03-17T18:02:54.815560780Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:54.815717 containerd[1738]: time="2025-03-17T18:02:54.815651482Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:54.815717 containerd[1738]: time="2025-03-17T18:02:54.815664882Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:54.816136 containerd[1738]: time="2025-03-17T18:02:54.816108691Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:54.816232 containerd[1738]: time="2025-03-17T18:02:54.816200293Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:54.816286 containerd[1738]: 
time="2025-03-17T18:02:54.816228794Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:54.816716 containerd[1738]: time="2025-03-17T18:02:54.816577701Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:54.816716 containerd[1738]: time="2025-03-17T18:02:54.816664703Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:54.816716 containerd[1738]: time="2025-03-17T18:02:54.816681203Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:54.817362 containerd[1738]: time="2025-03-17T18:02:54.817064811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:5,}" Mar 17 18:02:54.817447 kubelet[2614]: I0317 18:02:54.816825 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e" Mar 17 18:02:54.817533 containerd[1738]: time="2025-03-17T18:02:54.817509520Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:02:54.817720 containerd[1738]: time="2025-03-17T18:02:54.817697024Z" level=info msg="Ensure that sandbox d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e in task-service has been cleanup successfully" Mar 17 18:02:54.817856 containerd[1738]: time="2025-03-17T18:02:54.817834227Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:02:54.817856 containerd[1738]: time="2025-03-17T18:02:54.817851127Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:02:54.818128 containerd[1738]: time="2025-03-17T18:02:54.818107033Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:54.818325 containerd[1738]: time="2025-03-17T18:02:54.818288736Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:02:54.818325 containerd[1738]: time="2025-03-17T18:02:54.818306537Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:54.818741 containerd[1738]: time="2025-03-17T18:02:54.818717945Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:54.818836 containerd[1738]: time="2025-03-17T18:02:54.818814847Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:54.818836 containerd[1738]: time="2025-03-17T18:02:54.818832348Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:54.819182 containerd[1738]: time="2025-03-17T18:02:54.819086353Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:54.819182 containerd[1738]: time="2025-03-17T18:02:54.819176855Z" level=info msg="TearDown network for sandbox 
\"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:54.819358 containerd[1738]: time="2025-03-17T18:02:54.819192355Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:54.819506 containerd[1738]: time="2025-03-17T18:02:54.819476961Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:54.819595 containerd[1738]: time="2025-03-17T18:02:54.819565863Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:54.819595 containerd[1738]: time="2025-03-17T18:02:54.819583563Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:54.820126 containerd[1738]: time="2025-03-17T18:02:54.820047173Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:5,}" Mar 17 18:02:54.842712 systemd[1]: run-netns-cni\x2d1a1d9459\x2d84d4\x2db128\x2ddc05\x2d03ca416b47b0.mount: Deactivated successfully. Mar 17 18:02:54.842821 systemd[1]: run-netns-cni\x2d2988705f\x2d7d9d\x2db9ae\x2d42ff\x2db7885afd53bf.mount: Deactivated successfully. Mar 17 18:02:54.842905 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749-shm.mount: Deactivated successfully. Mar 17 18:02:55.681310 kubelet[2614]: E0317 18:02:55.680308 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:55.783834 containerd[1738]: time="2025-03-17T18:02:55.783780708Z" level=error msg="Failed to destroy network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.784763 containerd[1738]: time="2025-03-17T18:02:55.784511819Z" level=error msg="encountered an error cleaning up failed sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.784763 containerd[1738]: time="2025-03-17T18:02:55.784580120Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.785569 kubelet[2614]: E0317 18:02:55.785090 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Mar 17 18:02:55.785569 kubelet[2614]: E0317 18:02:55.785155 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:55.785569 kubelet[2614]: E0317 18:02:55.785182 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:55.785756 kubelet[2614]: E0317 18:02:55.785268 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:55.817676 containerd[1738]: time="2025-03-17T18:02:55.817344987Z" level=error msg="Failed to destroy network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.818079 containerd[1738]: time="2025-03-17T18:02:55.817765993Z" level=error msg="encountered an error cleaning up failed sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.818079 containerd[1738]: time="2025-03-17T18:02:55.817852495Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.818164 kubelet[2614]: E0317 18:02:55.818046 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:55.818164 kubelet[2614]: E0317 18:02:55.818110 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:55.818164 kubelet[2614]: E0317 18:02:55.818139 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:55.819005 kubelet[2614]: E0317 18:02:55.818187 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:55.823037 kubelet[2614]: I0317 18:02:55.822415 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287" Mar 17 18:02:55.823668 containerd[1738]: time="2025-03-17T18:02:55.823645477Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:02:55.824277 containerd[1738]: time="2025-03-17T18:02:55.824253686Z" level=info msg="Ensure that sandbox 41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287 in task-service has been cleanup successfully" Mar 17 18:02:55.824601 containerd[1738]: time="2025-03-17T18:02:55.824502190Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:02:55.824601 containerd[1738]: time="2025-03-17T18:02:55.824523190Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:02:55.824950 containerd[1738]: time="2025-03-17T18:02:55.824929896Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:02:55.825328 containerd[1738]: time="2025-03-17T18:02:55.825220500Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:02:55.825328 containerd[1738]: time="2025-03-17T18:02:55.825240100Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:02:55.825907 containerd[1738]: 
time="2025-03-17T18:02:55.825833209Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:55.826284 containerd[1738]: time="2025-03-17T18:02:55.826200914Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:02:55.826447 containerd[1738]: time="2025-03-17T18:02:55.826360116Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:55.826797 containerd[1738]: time="2025-03-17T18:02:55.826777222Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:55.827259 containerd[1738]: time="2025-03-17T18:02:55.827062726Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:55.827259 containerd[1738]: time="2025-03-17T18:02:55.827083726Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:55.827720 containerd[1738]: time="2025-03-17T18:02:55.827697635Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:55.827993 containerd[1738]: time="2025-03-17T18:02:55.827928739Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:55.827993 containerd[1738]: time="2025-03-17T18:02:55.827946939Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:55.828343 containerd[1738]: time="2025-03-17T18:02:55.828317944Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:55.828476 containerd[1738]: time="2025-03-17T18:02:55.828404845Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:55.828476 containerd[1738]: time="2025-03-17T18:02:55.828424446Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:55.830119 containerd[1738]: time="2025-03-17T18:02:55.829836066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:6,}" Mar 17 18:02:55.831452 kubelet[2614]: I0317 18:02:55.831410 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540" Mar 17 18:02:55.832124 containerd[1738]: time="2025-03-17T18:02:55.832099498Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:02:55.832444 containerd[1738]: time="2025-03-17T18:02:55.832416003Z" level=info msg="Ensure that sandbox dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540 in task-service has been cleanup successfully" Mar 17 18:02:55.832579 containerd[1738]: time="2025-03-17T18:02:55.832556805Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:02:55.832579 containerd[1738]: time="2025-03-17T18:02:55.832575705Z" level=info msg="StopPodSandbox for 
\"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:02:55.833387 containerd[1738]: time="2025-03-17T18:02:55.833359216Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:02:55.833463 containerd[1738]: time="2025-03-17T18:02:55.833444717Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:02:55.833463 containerd[1738]: time="2025-03-17T18:02:55.833458817Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:02:55.834193 containerd[1738]: time="2025-03-17T18:02:55.834069126Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:55.834193 containerd[1738]: time="2025-03-17T18:02:55.834159527Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:55.834193 containerd[1738]: time="2025-03-17T18:02:55.834172828Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:55.834586 containerd[1738]: time="2025-03-17T18:02:55.834424031Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:55.834586 containerd[1738]: time="2025-03-17T18:02:55.834502632Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:55.834586 containerd[1738]: time="2025-03-17T18:02:55.834514533Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:55.836073 containerd[1738]: time="2025-03-17T18:02:55.835968553Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:55.836073 containerd[1738]: time="2025-03-17T18:02:55.836057655Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:55.836073 containerd[1738]: time="2025-03-17T18:02:55.836072855Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:55.836780 containerd[1738]: time="2025-03-17T18:02:55.836680063Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:55.836780 containerd[1738]: time="2025-03-17T18:02:55.836766165Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:55.836780 containerd[1738]: time="2025-03-17T18:02:55.836779265Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:55.837842 containerd[1738]: time="2025-03-17T18:02:55.837815080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:6,}" Mar 17 18:02:55.842600 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287-shm.mount: Deactivated successfully. 
Mar 17 18:02:55.842934 systemd[1]: run-netns-cni\x2d2651b5ec\x2da6a1\x2d663d\x2d9048\x2d6c3729c18171.mount: Deactivated successfully. Mar 17 18:02:55.843022 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540-shm.mount: Deactivated successfully. Mar 17 18:02:56.013441 containerd[1738]: time="2025-03-17T18:02:56.013313585Z" level=error msg="Failed to destroy network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.015775 containerd[1738]: time="2025-03-17T18:02:56.015546017Z" level=error msg="encountered an error cleaning up failed sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.015775 containerd[1738]: time="2025-03-17T18:02:56.015663718Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.016071 kubelet[2614]: E0317 18:02:56.016030 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.016145 kubelet[2614]: E0317 18:02:56.016102 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:56.016145 kubelet[2614]: E0317 18:02:56.016131 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:56.018234 kubelet[2614]: E0317 18:02:56.016186 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:56.020041 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31-shm.mount: Deactivated successfully. Mar 17 18:02:56.030340 containerd[1738]: time="2025-03-17T18:02:56.030187826Z" level=error msg="Failed to destroy network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.030926 containerd[1738]: time="2025-03-17T18:02:56.030884336Z" level=error msg="encountered an error cleaning up failed sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.031013 containerd[1738]: time="2025-03-17T18:02:56.030979337Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:6,} failed, error" error="failed to setup network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.031373 kubelet[2614]: E0317 18:02:56.031278 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:56.031456 kubelet[2614]: E0317 18:02:56.031384 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:56.031456 kubelet[2614]: E0317 18:02:56.031412 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:56.032227 kubelet[2614]: E0317 18:02:56.031982 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:56.033021 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3-shm.mount: Deactivated successfully. Mar 17 18:02:56.681541 kubelet[2614]: E0317 18:02:56.681497 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:56.837070 kubelet[2614]: I0317 18:02:56.836797 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3" Mar 17 18:02:56.838225 containerd[1738]: time="2025-03-17T18:02:56.837652452Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:02:56.838225 containerd[1738]: time="2025-03-17T18:02:56.837881455Z" level=info msg="Ensure that sandbox 9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3 in task-service has been cleanup successfully" Mar 17 18:02:56.838225 containerd[1738]: time="2025-03-17T18:02:56.838133559Z" level=info msg="TearDown network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" successfully" Mar 17 18:02:56.838225 containerd[1738]: time="2025-03-17T18:02:56.838153159Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" returns successfully" Mar 17 18:02:56.839169 containerd[1738]: time="2025-03-17T18:02:56.839142573Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:02:56.839539 containerd[1738]: time="2025-03-17T18:02:56.839372876Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:02:56.839539 containerd[1738]: time="2025-03-17T18:02:56.839392477Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:02:56.839958 containerd[1738]: time="2025-03-17T18:02:56.839936884Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:02:56.840191 containerd[1738]: time="2025-03-17T18:02:56.840173388Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:02:56.840378 containerd[1738]: time="2025-03-17T18:02:56.840290889Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:02:56.840954 containerd[1738]: time="2025-03-17T18:02:56.840792697Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:56.840954 containerd[1738]: time="2025-03-17T18:02:56.840878898Z" level=info msg="TearDown network for 
sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:56.840954 containerd[1738]: time="2025-03-17T18:02:56.840892198Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:56.845403 containerd[1738]: time="2025-03-17T18:02:56.845378062Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:56.845529 systemd[1]: run-netns-cni\x2d67b1f714\x2dde1f\x2d4108\x2d04b9\x2d647bdfe4e65e.mount: Deactivated successfully. Mar 17 18:02:56.846160 containerd[1738]: time="2025-03-17T18:02:56.845645966Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:56.846160 containerd[1738]: time="2025-03-17T18:02:56.845665366Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:56.847814 containerd[1738]: time="2025-03-17T18:02:56.847681295Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:56.847814 containerd[1738]: time="2025-03-17T18:02:56.847760196Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:56.847814 containerd[1738]: time="2025-03-17T18:02:56.847772496Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:56.849193 containerd[1738]: time="2025-03-17T18:02:56.849170916Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:56.849518 containerd[1738]: time="2025-03-17T18:02:56.849367619Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:56.849518 containerd[1738]: time="2025-03-17T18:02:56.849387019Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:56.849976 containerd[1738]: time="2025-03-17T18:02:56.849953127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:7,}" Mar 17 18:02:56.850602 kubelet[2614]: I0317 18:02:56.850478 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31" Mar 17 18:02:56.852130 containerd[1738]: time="2025-03-17T18:02:56.851990856Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:02:56.852240 containerd[1738]: time="2025-03-17T18:02:56.852197259Z" level=info msg="Ensure that sandbox a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31 in task-service has been cleanup successfully" Mar 17 18:02:56.852527 containerd[1738]: time="2025-03-17T18:02:56.852384562Z" level=info msg="TearDown network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" successfully" Mar 17 18:02:56.852527 containerd[1738]: time="2025-03-17T18:02:56.852403762Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" returns successfully" Mar 17 18:02:56.853862 containerd[1738]: 
time="2025-03-17T18:02:56.853832083Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:02:56.853947 containerd[1738]: time="2025-03-17T18:02:56.853922984Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:02:56.853947 containerd[1738]: time="2025-03-17T18:02:56.853937784Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:02:56.856248 systemd[1]: run-netns-cni\x2da14a906b\x2d5509\x2d8f37\x2d8afa\x2dca8a3fba79b8.mount: Deactivated successfully. Mar 17 18:02:56.858364 containerd[1738]: time="2025-03-17T18:02:56.858336847Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:02:56.858490 containerd[1738]: time="2025-03-17T18:02:56.858424548Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:02:56.858490 containerd[1738]: time="2025-03-17T18:02:56.858443148Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:02:56.858809 containerd[1738]: time="2025-03-17T18:02:56.858787053Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:56.858887 containerd[1738]: time="2025-03-17T18:02:56.858868155Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:02:56.858938 containerd[1738]: time="2025-03-17T18:02:56.858885855Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:56.859671 containerd[1738]: time="2025-03-17T18:02:56.859606465Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:56.859753 containerd[1738]: time="2025-03-17T18:02:56.859692266Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:56.859753 containerd[1738]: time="2025-03-17T18:02:56.859705567Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:56.860838 containerd[1738]: time="2025-03-17T18:02:56.860811382Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:56.860906 containerd[1738]: time="2025-03-17T18:02:56.860888283Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:56.860959 containerd[1738]: time="2025-03-17T18:02:56.860902784Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:56.862558 containerd[1738]: time="2025-03-17T18:02:56.862531507Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:56.862692 containerd[1738]: time="2025-03-17T18:02:56.862620508Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:56.862692 containerd[1738]: time="2025-03-17T18:02:56.862638608Z" level=info 
msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:56.864046 containerd[1738]: time="2025-03-17T18:02:56.863935627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:7,}" Mar 17 18:02:57.021819 containerd[1738]: time="2025-03-17T18:02:57.020698565Z" level=error msg="Failed to destroy network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.021819 containerd[1738]: time="2025-03-17T18:02:57.021455975Z" level=error msg="encountered an error cleaning up failed sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.021819 containerd[1738]: time="2025-03-17T18:02:57.021525976Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.022658 kubelet[2614]: E0317 18:02:57.021920 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.022658 kubelet[2614]: E0317 18:02:57.021997 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:57.022658 kubelet[2614]: E0317 18:02:57.022035 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:57.022819 kubelet[2614]: E0317 18:02:57.022105 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:57.043958 containerd[1738]: time="2025-03-17T18:02:57.043651392Z" level=error msg="Failed to destroy network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.044443 containerd[1738]: time="2025-03-17T18:02:57.044288301Z" level=error msg="encountered an error cleaning up failed sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.044443 containerd[1738]: time="2025-03-17T18:02:57.044377803Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.044960 kubelet[2614]: E0317 18:02:57.044820 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:57.044960 kubelet[2614]: E0317 18:02:57.044883 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:57.044960 kubelet[2614]: E0317 18:02:57.044910 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:57.045388 kubelet[2614]: E0317 18:02:57.044975 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:57.682449 kubelet[2614]: E0317 18:02:57.682348 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:57.843543 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436-shm.mount: Deactivated successfully. Mar 17 18:02:57.843823 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7-shm.mount: Deactivated successfully. Mar 17 18:02:57.859344 kubelet[2614]: I0317 18:02:57.859316 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7" Mar 17 18:02:57.860425 containerd[1738]: time="2025-03-17T18:02:57.860393751Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" Mar 17 18:02:57.861175 containerd[1738]: time="2025-03-17T18:02:57.861031360Z" level=info msg="Ensure that sandbox 6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7 in task-service has been cleanup successfully" Mar 17 18:02:57.861406 containerd[1738]: time="2025-03-17T18:02:57.861384965Z" level=info msg="TearDown network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" successfully" Mar 17 18:02:57.861495 containerd[1738]: time="2025-03-17T18:02:57.861480666Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" returns successfully" Mar 17 18:02:57.864016 systemd[1]: run-netns-cni\x2d655cdf25\x2d0591\x2d24e8\x2d961e\x2d0bd6e362a124.mount: Deactivated successfully. 
Mar 17 18:02:57.864300 containerd[1738]: time="2025-03-17T18:02:57.864139504Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:02:57.864300 containerd[1738]: time="2025-03-17T18:02:57.864247506Z" level=info msg="TearDown network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" successfully" Mar 17 18:02:57.864617 containerd[1738]: time="2025-03-17T18:02:57.864299106Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" returns successfully" Mar 17 18:02:57.864774 containerd[1738]: time="2025-03-17T18:02:57.864673912Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:02:57.864774 containerd[1738]: time="2025-03-17T18:02:57.864757613Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:02:57.864774 containerd[1738]: time="2025-03-17T18:02:57.864771113Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:02:57.866269 containerd[1738]: time="2025-03-17T18:02:57.865935530Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:02:57.866269 containerd[1738]: time="2025-03-17T18:02:57.866073732Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:02:57.866269 containerd[1738]: time="2025-03-17T18:02:57.866090032Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:02:57.866576 containerd[1738]: time="2025-03-17T18:02:57.866554439Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:57.866665 containerd[1738]: time="2025-03-17T18:02:57.866640340Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:57.866665 containerd[1738]: time="2025-03-17T18:02:57.866660440Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:57.867507 containerd[1738]: time="2025-03-17T18:02:57.867110847Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:57.867667 containerd[1738]: time="2025-03-17T18:02:57.867197848Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:57.867750 containerd[1738]: time="2025-03-17T18:02:57.867734456Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:57.868643 containerd[1738]: time="2025-03-17T18:02:57.868355464Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:57.868643 containerd[1738]: time="2025-03-17T18:02:57.868434466Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:57.868643 containerd[1738]: time="2025-03-17T18:02:57.868448166Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" 
returns successfully" Mar 17 18:02:57.869064 containerd[1738]: time="2025-03-17T18:02:57.868934173Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:57.869537 containerd[1738]: time="2025-03-17T18:02:57.869419980Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:57.869537 containerd[1738]: time="2025-03-17T18:02:57.869441580Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:57.870546 containerd[1738]: time="2025-03-17T18:02:57.870040488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:8,}" Mar 17 18:02:57.871245 kubelet[2614]: I0317 18:02:57.871228 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436" Mar 17 18:02:57.872290 containerd[1738]: time="2025-03-17T18:02:57.872231720Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" Mar 17 18:02:57.872480 containerd[1738]: time="2025-03-17T18:02:57.872453523Z" level=info msg="Ensure that sandbox 5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436 in task-service has been cleanup successfully" Mar 17 18:02:57.872690 containerd[1738]: time="2025-03-17T18:02:57.872603725Z" level=info msg="TearDown network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" successfully" Mar 17 18:02:57.872690 containerd[1738]: time="2025-03-17T18:02:57.872623925Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" returns successfully" Mar 17 18:02:57.875997 containerd[1738]: time="2025-03-17T18:02:57.874964959Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:02:57.875997 containerd[1738]: time="2025-03-17T18:02:57.875056760Z" level=info msg="TearDown network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" successfully" Mar 17 18:02:57.875997 containerd[1738]: time="2025-03-17T18:02:57.875069360Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" returns successfully" Mar 17 18:02:57.875872 systemd[1]: run-netns-cni\x2d0b55a4be\x2d9a42\x2dcf91\x2db601\x2d7f52e5cb3f26.mount: Deactivated successfully. 
Mar 17 18:02:57.880492 containerd[1738]: time="2025-03-17T18:02:57.880461937Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:02:57.880569 containerd[1738]: time="2025-03-17T18:02:57.880548938Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:02:57.880569 containerd[1738]: time="2025-03-17T18:02:57.880563539Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:02:57.880911 containerd[1738]: time="2025-03-17T18:02:57.880886443Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:02:57.880984 containerd[1738]: time="2025-03-17T18:02:57.880971844Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:02:57.881029 containerd[1738]: time="2025-03-17T18:02:57.880985645Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:02:57.881418 containerd[1738]: time="2025-03-17T18:02:57.881396251Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:57.881757 containerd[1738]: time="2025-03-17T18:02:57.881655054Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:02:57.881757 containerd[1738]: time="2025-03-17T18:02:57.881673154Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:57.882769 containerd[1738]: time="2025-03-17T18:02:57.882622568Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:57.882769 containerd[1738]: time="2025-03-17T18:02:57.882734270Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:57.882769 containerd[1738]: time="2025-03-17T18:02:57.882748570Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:57.883832 containerd[1738]: time="2025-03-17T18:02:57.883629282Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:57.883832 containerd[1738]: time="2025-03-17T18:02:57.883716584Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:57.883832 containerd[1738]: time="2025-03-17T18:02:57.883733384Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:57.884156 containerd[1738]: time="2025-03-17T18:02:57.884137490Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:57.884337 containerd[1738]: time="2025-03-17T18:02:57.884319092Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:57.884440 containerd[1738]: time="2025-03-17T18:02:57.884424094Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" 
returns successfully" Mar 17 18:02:57.885237 containerd[1738]: time="2025-03-17T18:02:57.885194705Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:8,}" Mar 17 18:02:58.030282 containerd[1738]: time="2025-03-17T18:02:58.030237675Z" level=error msg="Failed to destroy network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.031032 containerd[1738]: time="2025-03-17T18:02:58.030996386Z" level=error msg="encountered an error cleaning up failed sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.031228 containerd[1738]: time="2025-03-17T18:02:58.031183089Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:8,} failed, error" error="failed to setup network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.031607 kubelet[2614]: E0317 18:02:58.031567 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.031713 kubelet[2614]: E0317 18:02:58.031635 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:58.031713 kubelet[2614]: E0317 18:02:58.031665 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:58.031791 kubelet[2614]: E0317 18:02:58.031718 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:58.052260 containerd[1738]: time="2025-03-17T18:02:58.052224389Z" level=error msg="Failed to destroy network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.053378 containerd[1738]: time="2025-03-17T18:02:58.053346305Z" level=error msg="encountered an error cleaning up failed sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.053553 containerd[1738]: time="2025-03-17T18:02:58.053527408Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:8,} failed, error" error="failed to setup network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.053891 kubelet[2614]: E0317 18:02:58.053844 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:58.054010 kubelet[2614]: E0317 18:02:58.053983 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:58.054076 kubelet[2614]: E0317 18:02:58.054022 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:58.054642 kubelet[2614]: E0317 18:02:58.054594 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:58.683061 kubelet[2614]: E0317 18:02:58.682751 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:58.808795 containerd[1738]: time="2025-03-17T18:02:58.808741365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:58.810714 containerd[1738]: time="2025-03-17T18:02:58.810663893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 17 18:02:58.814518 containerd[1738]: time="2025-03-17T18:02:58.814462949Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:58.820435 containerd[1738]: time="2025-03-17T18:02:58.820284935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:02:58.821756 containerd[1738]: time="2025-03-17T18:02:58.821159148Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 8.036538307s" Mar 17 18:02:58.821756 containerd[1738]: time="2025-03-17T18:02:58.821202349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 17 18:02:58.828623 containerd[1738]: time="2025-03-17T18:02:58.828590958Z" level=info msg="CreateContainer within sandbox \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 18:02:58.844604 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb-shm.mount: Deactivated successfully. Mar 17 18:02:58.844733 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2864695910.mount: Deactivated successfully. 
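[Annotation] For scale, the pull entries just above report roughly 142,241,307 bytes for ghcr.io/flatcar/calico/node:v3.29.2 transferred in 8.036538307 s, i.e. about 17.7 MB/s. A throwaway sketch of that arithmetic (both values copied from the log entries; nothing else assumed):

    // Back-of-envelope throughput from the two pull entries above (sketch, not containerd output).
    package main

    import "fmt"

    func main() {
    	const bytesPulled = 142241307 // size "142241307" reported for calico/node:v3.29.2
    	const seconds = 8.036538307   // "in 8.036538307s" from the same entry
    	fmt.Printf("~%.1f MB/s\n", bytesPulled/seconds/1e6) // ≈ 17.7 MB/s
    }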
Mar 17 18:02:58.867568 containerd[1738]: time="2025-03-17T18:02:58.867533435Z" level=info msg="CreateContainer within sandbox \"3a9c485618c11cd2852d1d903167542165ceb0efe77c33877184bee69b861abb\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4\"" Mar 17 18:02:58.868269 containerd[1738]: time="2025-03-17T18:02:58.868118743Z" level=info msg="StartContainer for \"8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4\"" Mar 17 18:02:58.882684 kubelet[2614]: I0317 18:02:58.881587 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb" Mar 17 18:02:58.882812 containerd[1738]: time="2025-03-17T18:02:58.882514656Z" level=info msg="StopPodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\"" Mar 17 18:02:58.883106 containerd[1738]: time="2025-03-17T18:02:58.883084865Z" level=info msg="Ensure that sandbox ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb in task-service has been cleanup successfully" Mar 17 18:02:58.883672 containerd[1738]: time="2025-03-17T18:02:58.883647673Z" level=info msg="TearDown network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" successfully" Mar 17 18:02:58.884023 containerd[1738]: time="2025-03-17T18:02:58.883998078Z" level=info msg="StopPodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" returns successfully" Mar 17 18:02:58.885174 containerd[1738]: time="2025-03-17T18:02:58.885151495Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" Mar 17 18:02:58.885945 containerd[1738]: time="2025-03-17T18:02:58.885923407Z" level=info msg="TearDown network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" successfully" Mar 17 18:02:58.886096 containerd[1738]: time="2025-03-17T18:02:58.885998808Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" returns successfully" Mar 17 18:02:58.887320 containerd[1738]: time="2025-03-17T18:02:58.886879221Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:02:58.887320 containerd[1738]: time="2025-03-17T18:02:58.887167125Z" level=info msg="TearDown network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" successfully" Mar 17 18:02:58.887320 containerd[1738]: time="2025-03-17T18:02:58.887181625Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" returns successfully" Mar 17 18:02:58.888569 containerd[1738]: time="2025-03-17T18:02:58.888403443Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:02:58.888569 containerd[1738]: time="2025-03-17T18:02:58.888501145Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:02:58.888569 containerd[1738]: time="2025-03-17T18:02:58.888515845Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:02:58.889422 containerd[1738]: time="2025-03-17T18:02:58.889251856Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:02:58.889422 
containerd[1738]: time="2025-03-17T18:02:58.889350157Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:02:58.889422 containerd[1738]: time="2025-03-17T18:02:58.889363358Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:02:58.889721 systemd[1]: run-netns-cni\x2dce6bd64d\x2d31be\x2db18e\x2dbc9c\x2dbc94f0c6eb63.mount: Deactivated successfully. Mar 17 18:02:58.891487 containerd[1738]: time="2025-03-17T18:02:58.889885765Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:58.891487 containerd[1738]: time="2025-03-17T18:02:58.889963767Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:58.891487 containerd[1738]: time="2025-03-17T18:02:58.889988767Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:58.892284 containerd[1738]: time="2025-03-17T18:02:58.892058098Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:58.892284 containerd[1738]: time="2025-03-17T18:02:58.892174099Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:58.892284 containerd[1738]: time="2025-03-17T18:02:58.892189100Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:02:58.893767 containerd[1738]: time="2025-03-17T18:02:58.893337617Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:58.894065 containerd[1738]: time="2025-03-17T18:02:58.894016527Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:58.894214 kubelet[2614]: I0317 18:02:58.894162 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5" Mar 17 18:02:58.895607 containerd[1738]: time="2025-03-17T18:02:58.894307831Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:58.895607 containerd[1738]: time="2025-03-17T18:02:58.895233445Z" level=info msg="StopPodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\"" Mar 17 18:02:58.895607 containerd[1738]: time="2025-03-17T18:02:58.895463848Z" level=info msg="Ensure that sandbox 1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5 in task-service has been cleanup successfully" Mar 17 18:02:58.897152 containerd[1738]: time="2025-03-17T18:02:58.897106072Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:58.897982 containerd[1738]: time="2025-03-17T18:02:58.897956985Z" level=info msg="TearDown network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" successfully" Mar 17 18:02:58.898117 containerd[1738]: time="2025-03-17T18:02:58.898098787Z" level=info msg="StopPodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" returns successfully" Mar 17 
18:02:58.898437 containerd[1738]: time="2025-03-17T18:02:58.898413792Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:58.898823 containerd[1738]: time="2025-03-17T18:02:58.898771797Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:58.898997 containerd[1738]: time="2025-03-17T18:02:58.898642795Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" Mar 17 18:02:58.900074 containerd[1738]: time="2025-03-17T18:02:58.900051916Z" level=info msg="TearDown network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" successfully" Mar 17 18:02:58.900269 containerd[1738]: time="2025-03-17T18:02:58.900192818Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" returns successfully" Mar 17 18:02:58.900432 containerd[1738]: time="2025-03-17T18:02:58.900175318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:9,}" Mar 17 18:02:58.900626 systemd[1]: run-netns-cni\x2d3620ea06\x2d721a\x2d993b\x2d85a7\x2d2bfd3a51d141.mount: Deactivated successfully. Mar 17 18:02:58.901816 containerd[1738]: time="2025-03-17T18:02:58.901170932Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.901923444Z" level=info msg="TearDown network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" successfully" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.901944344Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" returns successfully" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.902333550Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.902418251Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.902431251Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.902725355Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.902948559Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:02:58.903129 containerd[1738]: time="2025-03-17T18:02:58.902965159Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:02:58.903466 containerd[1738]: time="2025-03-17T18:02:58.903381265Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:58.903506 containerd[1738]: time="2025-03-17T18:02:58.903461666Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" 
successfully" Mar 17 18:02:58.903506 containerd[1738]: time="2025-03-17T18:02:58.903474567Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:58.903815 containerd[1738]: time="2025-03-17T18:02:58.903790771Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:58.904123 containerd[1738]: time="2025-03-17T18:02:58.904102976Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:58.904252 containerd[1738]: time="2025-03-17T18:02:58.904231678Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:58.904624 containerd[1738]: time="2025-03-17T18:02:58.904601183Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:58.904885 containerd[1738]: time="2025-03-17T18:02:58.904865187Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:58.905559 containerd[1738]: time="2025-03-17T18:02:58.905531097Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:58.905942 containerd[1738]: time="2025-03-17T18:02:58.905918603Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:58.906201 containerd[1738]: time="2025-03-17T18:02:58.906180607Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:58.906312 containerd[1738]: time="2025-03-17T18:02:58.906293608Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:58.907075 containerd[1738]: time="2025-03-17T18:02:58.907045719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:9,}" Mar 17 18:02:58.917710 systemd[1]: Started cri-containerd-8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4.scope - libcontainer container 8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4. 
Mar 17 18:02:58.955695 containerd[1738]: time="2025-03-17T18:02:58.955565937Z" level=info msg="StartContainer for \"8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4\" returns successfully" Mar 17 18:02:59.049882 containerd[1738]: time="2025-03-17T18:02:59.049801732Z" level=error msg="Failed to destroy network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.050187 containerd[1738]: time="2025-03-17T18:02:59.050149037Z" level=error msg="encountered an error cleaning up failed sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.050308 containerd[1738]: time="2025-03-17T18:02:59.050241238Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:9,} failed, error" error="failed to setup network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.050653 containerd[1738]: time="2025-03-17T18:02:59.050411141Z" level=error msg="Failed to destroy network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.050732 kubelet[2614]: E0317 18:02:59.050675 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.050827 kubelet[2614]: E0317 18:02:59.050733 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:59.050827 kubelet[2614]: E0317 18:02:59.050803 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-7fcdb87857-lzds7" Mar 17 18:02:59.050927 kubelet[2614]: E0317 18:02:59.050853 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-7fcdb87857-lzds7_default(7c8e6616-65fb-4a59-beb7-79a79cf32e0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-7fcdb87857-lzds7" podUID="7c8e6616-65fb-4a59-beb7-79a79cf32e0c" Mar 17 18:02:59.051534 containerd[1738]: time="2025-03-17T18:02:59.051491957Z" level=error msg="encountered an error cleaning up failed sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.051631 containerd[1738]: time="2025-03-17T18:02:59.051556958Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:9,} failed, error" error="failed to setup network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.051761 kubelet[2614]: E0317 18:02:59.051728 2614 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:02:59.051829 kubelet[2614]: E0317 18:02:59.051784 2614 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:59.051829 kubelet[2614]: E0317 18:02:59.051808 2614 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dqftv" Mar 17 18:02:59.051931 kubelet[2614]: E0317 18:02:59.051882 2614 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dqftv_calico-system(82ad5d43-e4e0-4c4f-8306-eb61af926aaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dqftv" podUID="82ad5d43-e4e0-4c4f-8306-eb61af926aaf" Mar 17 18:02:59.399492 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 18:02:59.399925 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 18:02:59.683898 kubelet[2614]: E0317 18:02:59.683758 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:02:59.900454 kubelet[2614]: I0317 18:02:59.900419 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a" Mar 17 18:02:59.904252 containerd[1738]: time="2025-03-17T18:02:59.901763239Z" level=info msg="StopPodSandbox for \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\"" Mar 17 18:02:59.904252 containerd[1738]: time="2025-03-17T18:02:59.902009543Z" level=info msg="Ensure that sandbox 49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a in task-service has been cleanup successfully" Mar 17 18:02:59.905043 containerd[1738]: time="2025-03-17T18:02:59.904770184Z" level=info msg="TearDown network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\" successfully" Mar 17 18:02:59.905043 containerd[1738]: time="2025-03-17T18:02:59.904810384Z" level=info msg="StopPodSandbox for \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\" returns successfully" Mar 17 18:02:59.905243 containerd[1738]: time="2025-03-17T18:02:59.905082088Z" level=info msg="StopPodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\"" Mar 17 18:02:59.905243 containerd[1738]: time="2025-03-17T18:02:59.905166190Z" level=info msg="TearDown network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" successfully" Mar 17 18:02:59.905243 containerd[1738]: time="2025-03-17T18:02:59.905180190Z" level=info msg="StopPodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" returns successfully" Mar 17 18:02:59.905936 containerd[1738]: time="2025-03-17T18:02:59.905771799Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" Mar 17 18:02:59.905936 containerd[1738]: time="2025-03-17T18:02:59.905872300Z" level=info msg="TearDown network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" successfully" Mar 17 18:02:59.905936 containerd[1738]: time="2025-03-17T18:02:59.905888000Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" returns successfully" Mar 17 18:02:59.906292 containerd[1738]: time="2025-03-17T18:02:59.906267106Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:02:59.906378 containerd[1738]: time="2025-03-17T18:02:59.906356407Z" level=info msg="TearDown network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" successfully" Mar 17 18:02:59.906378 containerd[1738]: time="2025-03-17T18:02:59.906371608Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" returns 
successfully" Mar 17 18:02:59.906538 systemd[1]: run-netns-cni\x2d915296e1\x2d4c44\x2d3d04\x2d17d2\x2d5fe639f951ce.mount: Deactivated successfully. Mar 17 18:02:59.908509 kubelet[2614]: I0317 18:02:59.907479 2614 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2" Mar 17 18:02:59.908599 containerd[1738]: time="2025-03-17T18:02:59.907702127Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:02:59.908599 containerd[1738]: time="2025-03-17T18:02:59.907840129Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:02:59.908599 containerd[1738]: time="2025-03-17T18:02:59.907875630Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:02:59.908599 containerd[1738]: time="2025-03-17T18:02:59.908228235Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:02:59.908599 containerd[1738]: time="2025-03-17T18:02:59.908305536Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:02:59.908599 containerd[1738]: time="2025-03-17T18:02:59.908316836Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:02:59.909645 containerd[1738]: time="2025-03-17T18:02:59.909369352Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:02:59.909645 containerd[1738]: time="2025-03-17T18:02:59.909453153Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:02:59.909645 containerd[1738]: time="2025-03-17T18:02:59.909465953Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:02:59.909873 containerd[1738]: time="2025-03-17T18:02:59.909821459Z" level=info msg="StopPodSandbox for \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\"" Mar 17 18:02:59.910084 containerd[1738]: time="2025-03-17T18:02:59.910034062Z" level=info msg="Ensure that sandbox 33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2 in task-service has been cleanup successfully" Mar 17 18:02:59.910491 containerd[1738]: time="2025-03-17T18:02:59.910211464Z" level=info msg="TearDown network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\" successfully" Mar 17 18:02:59.910491 containerd[1738]: time="2025-03-17T18:02:59.910229365Z" level=info msg="StopPodSandbox for \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\" returns successfully" Mar 17 18:02:59.913221 containerd[1738]: time="2025-03-17T18:02:59.912545799Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:02:59.913221 containerd[1738]: time="2025-03-17T18:02:59.912683601Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:02:59.913221 containerd[1738]: time="2025-03-17T18:02:59.912700501Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" 
returns successfully" Mar 17 18:02:59.913221 containerd[1738]: time="2025-03-17T18:02:59.912547999Z" level=info msg="StopPodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\"" Mar 17 18:02:59.913221 containerd[1738]: time="2025-03-17T18:02:59.912859204Z" level=info msg="TearDown network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" successfully" Mar 17 18:02:59.913221 containerd[1738]: time="2025-03-17T18:02:59.912872604Z" level=info msg="StopPodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" returns successfully" Mar 17 18:02:59.914991 containerd[1738]: time="2025-03-17T18:02:59.913733516Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:02:59.914991 containerd[1738]: time="2025-03-17T18:02:59.913817718Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:02:59.914991 containerd[1738]: time="2025-03-17T18:02:59.913832218Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:02:59.914991 containerd[1738]: time="2025-03-17T18:02:59.913877419Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" Mar 17 18:02:59.914991 containerd[1738]: time="2025-03-17T18:02:59.913978420Z" level=info msg="TearDown network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" successfully" Mar 17 18:02:59.914991 containerd[1738]: time="2025-03-17T18:02:59.913993820Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" returns successfully" Mar 17 18:02:59.913936 systemd[1]: run-netns-cni\x2d4d98bf78\x2d320b\x2de568\x2dfeba\x2d51bb0d4711b4.mount: Deactivated successfully. 
Mar 17 18:02:59.915850 containerd[1738]: time="2025-03-17T18:02:59.915466442Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:02:59.915850 containerd[1738]: time="2025-03-17T18:02:59.915554143Z" level=info msg="TearDown network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" successfully" Mar 17 18:02:59.915850 containerd[1738]: time="2025-03-17T18:02:59.915568344Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" returns successfully" Mar 17 18:02:59.915850 containerd[1738]: time="2025-03-17T18:02:59.915636445Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:02:59.915850 containerd[1738]: time="2025-03-17T18:02:59.915706946Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:02:59.915850 containerd[1738]: time="2025-03-17T18:02:59.915719946Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:02:59.916529 containerd[1738]: time="2025-03-17T18:02:59.916506258Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:02:59.916699 containerd[1738]: time="2025-03-17T18:02:59.916682360Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:02:59.916773 containerd[1738]: time="2025-03-17T18:02:59.916760961Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:02:59.917314 containerd[1738]: time="2025-03-17T18:02:59.917278169Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:02:59.917889 containerd[1738]: time="2025-03-17T18:02:59.917432971Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:02:59.917889 containerd[1738]: time="2025-03-17T18:02:59.917452172Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:02:59.917889 containerd[1738]: time="2025-03-17T18:02:59.917485072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:10,}" Mar 17 18:02:59.919963 containerd[1738]: time="2025-03-17T18:02:59.919939108Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:02:59.920199 containerd[1738]: time="2025-03-17T18:02:59.920116811Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:02:59.920199 containerd[1738]: time="2025-03-17T18:02:59.920136711Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:02:59.920675 containerd[1738]: time="2025-03-17T18:02:59.920477816Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:02:59.920675 containerd[1738]: time="2025-03-17T18:02:59.920575818Z" level=info msg="TearDown network for sandbox 
\"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:02:59.920675 containerd[1738]: time="2025-03-17T18:02:59.920590018Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:02:59.921146 containerd[1738]: time="2025-03-17T18:02:59.920982924Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:02:59.921398 containerd[1738]: time="2025-03-17T18:02:59.921194327Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:02:59.921398 containerd[1738]: time="2025-03-17T18:02:59.921341329Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:02:59.921965 containerd[1738]: time="2025-03-17T18:02:59.921800236Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:02:59.922082 containerd[1738]: time="2025-03-17T18:02:59.922062240Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:02:59.922227 containerd[1738]: time="2025-03-17T18:02:59.922163041Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:02:59.923560 containerd[1738]: time="2025-03-17T18:02:59.923326958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:10,}" Mar 17 18:02:59.931369 kubelet[2614]: I0317 18:02:59.931289 2614 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9hhzz" podStartSLOduration=5.145637612 podStartE2EDuration="25.931245476s" podCreationTimestamp="2025-03-17 18:02:34 +0000 UTC" firstStartedPulling="2025-03-17 18:02:38.036371897 +0000 UTC m=+4.085604872" lastFinishedPulling="2025-03-17 18:02:58.821979761 +0000 UTC m=+24.871212736" observedRunningTime="2025-03-17 18:02:59.931194675 +0000 UTC m=+25.980427650" watchObservedRunningTime="2025-03-17 18:02:59.931245476 +0000 UTC m=+25.980478351" Mar 17 18:02:59.939624 systemd[1]: run-containerd-runc-k8s.io-8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4-runc.2YVKL5.mount: Deactivated successfully. 
Mar 17 18:03:00.115477 systemd-networkd[1344]: calic532952020a: Link UP Mar 17 18:03:00.115698 systemd-networkd[1344]: calic532952020a: Gained carrier Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.004 [INFO][3828] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.021 [INFO][3828] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.200.4.11-k8s-csi--node--driver--dqftv-eth0 csi-node-driver- calico-system 82ad5d43-e4e0-4c4f-8306-eb61af926aaf 1390 0 2025-03-17 18:02:34 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 10.200.4.11 csi-node-driver-dqftv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic532952020a [] []}} ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.021 [INFO][3828] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.058 [INFO][3842] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" HandleID="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Workload="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.071 [INFO][3842] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" HandleID="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Workload="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051bb0), Attrs:map[string]string{"namespace":"calico-system", "node":"10.200.4.11", "pod":"csi-node-driver-dqftv", "timestamp":"2025-03-17 18:03:00.058502859 +0000 UTC"}, Hostname:"10.200.4.11", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.071 [INFO][3842] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.071 [INFO][3842] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.071 [INFO][3842] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.200.4.11' Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.073 [INFO][3842] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.082 [INFO][3842] ipam/ipam.go 372: Looking up existing affinities for host host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.085 [INFO][3842] ipam/ipam.go 489: Trying affinity for 192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.087 [INFO][3842] ipam/ipam.go 155: Attempting to load block cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.088 [INFO][3842] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.089 [INFO][3842] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.11.192/26 handle="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.090 [INFO][3842] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7 Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.097 [INFO][3842] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.11.192/26 handle="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.105 [INFO][3842] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.11.193/26] block=192.168.11.192/26 handle="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.105 [INFO][3842] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.11.193/26] handle="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" host="10.200.4.11" Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.105 [INFO][3842] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
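The IPAM steps above follow a fixed ladder: look up the host's block affinities, try the affine block 192.168.11.192/26, load it, confirm the affinity, assign one address under a handle named after the sandbox, and write the block back. A /26 spans 64 addresses (192.168.11.192–192.168.11.255); with the network address itself reserved, the first address handed out on this node is 192.168.11.193. A rough sketch of the "first free address in the block" step, assuming a simple allocation map rather than Calico's actual block data model:

package main

import (
	"fmt"
	"net"
)

// firstFreeAddr scans a block for the first address not yet allocated.
// allocated maps an IP's string form to true; Calico's real block structure
// is richer (allocation attributes, handles, sequence numbers).
func firstFreeAddr(block *net.IPNet, allocated map[string]bool) (net.IP, bool) {
	for ip := block.IP.Mask(block.Mask); block.Contains(ip); ip = nextIP(ip) {
		if !allocated[ip.String()] {
			return ip, true
		}
	}
	return nil, false
}

// nextIP returns a copy of ip incremented by one, with carry.
func nextIP(ip net.IP) net.IP {
	out := make(net.IP, len(ip))
	copy(out, ip)
	for i := len(out) - 1; i >= 0; i-- {
		out[i]++
		if out[i] != 0 {
			break
		}
	}
	return out
}

func main() {
	_, block, _ := net.ParseCIDR("192.168.11.192/26")
	// Treat the network address as reserved, as in the log above.
	alloc := map[string]bool{"192.168.11.192": true}
	if ip, ok := firstFreeAddr(block, alloc); ok {
		fmt.Println(ip) // 192.168.11.193, matching the address claimed above
	}
}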
Mar 17 18:03:00.126669 containerd[1738]: 2025-03-17 18:03:00.105 [INFO][3842] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.11.193/26] IPv6=[] ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" HandleID="k8s-pod-network.c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Workload="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.127879 containerd[1738]: 2025-03-17 18:03:00.107 [INFO][3828] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-csi--node--driver--dqftv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"82ad5d43-e4e0-4c4f-8306-eb61af926aaf", ResourceVersion:"1390", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 2, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"", Pod:"csi-node-driver-dqftv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.11.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic532952020a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:00.127879 containerd[1738]: 2025-03-17 18:03:00.107 [INFO][3828] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.11.193/32] ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.127879 containerd[1738]: 2025-03-17 18:03:00.107 [INFO][3828] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic532952020a ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.127879 containerd[1738]: 2025-03-17 18:03:00.114 [INFO][3828] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.127879 containerd[1738]: 2025-03-17 18:03:00.114 [INFO][3828] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" 
WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-csi--node--driver--dqftv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"82ad5d43-e4e0-4c4f-8306-eb61af926aaf", ResourceVersion:"1390", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 2, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7", Pod:"csi-node-driver-dqftv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.11.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic532952020a", MAC:"86:d8:85:36:20:ca", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:00.127879 containerd[1738]: 2025-03-17 18:03:00.125 [INFO][3828] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7" Namespace="calico-system" Pod="csi-node-driver-dqftv" WorkloadEndpoint="10.200.4.11-k8s-csi--node--driver--dqftv-eth0" Mar 17 18:03:00.148033 containerd[1738]: time="2025-03-17T18:03:00.147807480Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:03:00.148178 containerd[1738]: time="2025-03-17T18:03:00.147933482Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:03:00.148178 containerd[1738]: time="2025-03-17T18:03:00.147965683Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:00.148799 containerd[1738]: time="2025-03-17T18:03:00.148738294Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:00.169375 systemd[1]: Started cri-containerd-c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7.scope - libcontainer container c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7. 
Mar 17 18:03:00.188793 containerd[1738]: time="2025-03-17T18:03:00.188766786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dqftv,Uid:82ad5d43-e4e0-4c4f-8306-eb61af926aaf,Namespace:calico-system,Attempt:10,} returns sandbox id \"c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7\"" Mar 17 18:03:00.190517 containerd[1738]: time="2025-03-17T18:03:00.190419211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 18:03:00.211613 systemd-networkd[1344]: califa29a57dee0: Link UP Mar 17 18:03:00.211823 systemd-networkd[1344]: califa29a57dee0: Gained carrier Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.010 [INFO][3817] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.024 [INFO][3817] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0 nginx-deployment-7fcdb87857- default 7c8e6616-65fb-4a59-beb7-79a79cf32e0c 1464 0 2025-03-17 18:02:49 +0000 UTC map[app:nginx pod-template-hash:7fcdb87857 projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 10.200.4.11 nginx-deployment-7fcdb87857-lzds7 eth0 default [] [] [kns.default ksa.default.default] califa29a57dee0 [] []}} ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.024 [INFO][3817] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.064 [INFO][3847] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" HandleID="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Workload="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.073 [INFO][3847] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" HandleID="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Workload="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000422af0), Attrs:map[string]string{"namespace":"default", "node":"10.200.4.11", "pod":"nginx-deployment-7fcdb87857-lzds7", "timestamp":"2025-03-17 18:03:00.064131142 +0000 UTC"}, Hostname:"10.200.4.11", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.074 [INFO][3847] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.105 [INFO][3847] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.105 [INFO][3847] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.200.4.11' Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.174 [INFO][3847] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.179 [INFO][3847] ipam/ipam.go 372: Looking up existing affinities for host host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.191 [INFO][3847] ipam/ipam.go 489: Trying affinity for 192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.193 [INFO][3847] ipam/ipam.go 155: Attempting to load block cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.195 [INFO][3847] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.195 [INFO][3847] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.11.192/26 handle="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.197 [INFO][3847] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1 Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.200 [INFO][3847] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.11.192/26 handle="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.207 [INFO][3847] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.11.194/26] block=192.168.11.192/26 handle="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.207 [INFO][3847] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.11.194/26] handle="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" host="10.200.4.11" Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.207 [INFO][3847] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
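The lock lines explain why the two concurrent CNI ADDs could not collide: the nginx request logged "About to acquire host-wide IPAM lock" at 18:03:00.074 but only acquired it at 18:03:00.105, the same instant the csi-node-driver assignment released it, so the two assignments were serialized and produced distinct addresses (.193, then .194). A toy illustration of that serialization using an in-process mutex and a counter standing in for block allocation (the real lock is host-wide across processes, which a Go mutex is not):

package main

import (
	"fmt"
	"sync"
)

// ipamLock serializes address assignment so each request sees the block state
// left by the previous one; the incrementing counter is purely illustrative.
type ipamLock struct {
	mu   sync.Mutex
	next int
}

func (l *ipamLock) assign() string {
	l.mu.Lock()
	defer l.mu.Unlock()
	l.next++
	return fmt.Sprintf("192.168.11.%d/26", 192+l.next)
}

func main() {
	l := &ipamLock{}
	var wg sync.WaitGroup
	for i := 0; i < 2; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			fmt.Println(l.assign()) // .193 and .194, never the same address twice
		}()
	}
	wg.Wait()
}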
Mar 17 18:03:00.221172 containerd[1738]: 2025-03-17 18:03:00.208 [INFO][3847] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.11.194/26] IPv6=[] ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" HandleID="k8s-pod-network.03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Workload="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.222395 containerd[1738]: 2025-03-17 18:03:00.209 [INFO][3817] cni-plugin/k8s.go 386: Populated endpoint ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0", GenerateName:"nginx-deployment-7fcdb87857-", Namespace:"default", SelfLink:"", UID:"7c8e6616-65fb-4a59-beb7-79a79cf32e0c", ResourceVersion:"1464", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 2, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"7fcdb87857", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"", Pod:"nginx-deployment-7fcdb87857-lzds7", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.11.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"califa29a57dee0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:00.222395 containerd[1738]: 2025-03-17 18:03:00.209 [INFO][3817] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.11.194/32] ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.222395 containerd[1738]: 2025-03-17 18:03:00.209 [INFO][3817] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califa29a57dee0 ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.222395 containerd[1738]: 2025-03-17 18:03:00.211 [INFO][3817] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.222395 containerd[1738]: 2025-03-17 18:03:00.211 [INFO][3817] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0", GenerateName:"nginx-deployment-7fcdb87857-", Namespace:"default", SelfLink:"", UID:"7c8e6616-65fb-4a59-beb7-79a79cf32e0c", ResourceVersion:"1464", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 2, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"7fcdb87857", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1", Pod:"nginx-deployment-7fcdb87857-lzds7", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.11.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"califa29a57dee0", MAC:"02:0c:31:2a:12:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:00.222395 containerd[1738]: 2025-03-17 18:03:00.219 [INFO][3817] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1" Namespace="default" Pod="nginx-deployment-7fcdb87857-lzds7" WorkloadEndpoint="10.200.4.11-k8s-nginx--deployment--7fcdb87857--lzds7-eth0" Mar 17 18:03:00.244728 containerd[1738]: time="2025-03-17T18:03:00.244477411Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:03:00.244728 containerd[1738]: time="2025-03-17T18:03:00.244537912Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:03:00.244728 containerd[1738]: time="2025-03-17T18:03:00.244559312Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:00.244728 containerd[1738]: time="2025-03-17T18:03:00.244652713Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:00.266375 systemd[1]: Started cri-containerd-03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1.scope - libcontainer container 03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1. 
Mar 17 18:03:00.303638 containerd[1738]: time="2025-03-17T18:03:00.303590286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-7fcdb87857-lzds7,Uid:7c8e6616-65fb-4a59-beb7-79a79cf32e0c,Namespace:default,Attempt:10,} returns sandbox id \"03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1\"" Mar 17 18:03:00.684330 kubelet[2614]: E0317 18:03:00.684281 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:00.946038 kernel: bpftool[4072]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 18:03:00.974988 systemd[1]: run-containerd-runc-k8s.io-8764cf00eaa658309ac432fa8d7452f2a45c5b9c0d80f28cc517601e149ce2c4-runc.lS7hwN.mount: Deactivated successfully. Mar 17 18:03:01.281914 systemd-networkd[1344]: vxlan.calico: Link UP Mar 17 18:03:01.281924 systemd-networkd[1344]: vxlan.calico: Gained carrier Mar 17 18:03:01.426451 systemd-networkd[1344]: calic532952020a: Gained IPv6LL Mar 17 18:03:01.554404 systemd-networkd[1344]: califa29a57dee0: Gained IPv6LL Mar 17 18:03:01.685099 kubelet[2614]: E0317 18:03:01.685044 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:02.404013 containerd[1738]: time="2025-03-17T18:03:02.403965267Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:02.406627 containerd[1738]: time="2025-03-17T18:03:02.406446204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 17 18:03:02.411027 containerd[1738]: time="2025-03-17T18:03:02.410931170Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:02.414299 containerd[1738]: time="2025-03-17T18:03:02.414250220Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:02.414963 containerd[1738]: time="2025-03-17T18:03:02.414815928Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.224356017s" Mar 17 18:03:02.414963 containerd[1738]: time="2025-03-17T18:03:02.414849028Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 17 18:03:02.416723 containerd[1738]: time="2025-03-17T18:03:02.416467052Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Mar 17 18:03:02.417166 containerd[1738]: time="2025-03-17T18:03:02.417142062Z" level=info msg="CreateContainer within sandbox \"c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 18:03:02.454821 containerd[1738]: time="2025-03-17T18:03:02.454740119Z" level=info msg="CreateContainer within sandbox \"c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns 
container id \"0c6bd8277c97611145f41c4cd9553c69ee0f5954715940b03c20c00cf8dee255\"" Mar 17 18:03:02.455596 containerd[1738]: time="2025-03-17T18:03:02.455440829Z" level=info msg="StartContainer for \"0c6bd8277c97611145f41c4cd9553c69ee0f5954715940b03c20c00cf8dee255\"" Mar 17 18:03:02.485362 systemd[1]: Started cri-containerd-0c6bd8277c97611145f41c4cd9553c69ee0f5954715940b03c20c00cf8dee255.scope - libcontainer container 0c6bd8277c97611145f41c4cd9553c69ee0f5954715940b03c20c00cf8dee255. Mar 17 18:03:02.513149 containerd[1738]: time="2025-03-17T18:03:02.513106382Z" level=info msg="StartContainer for \"0c6bd8277c97611145f41c4cd9553c69ee0f5954715940b03c20c00cf8dee255\" returns successfully" Mar 17 18:03:02.685636 kubelet[2614]: E0317 18:03:02.685502 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:02.770632 systemd-networkd[1344]: vxlan.calico: Gained IPv6LL Mar 17 18:03:03.685866 kubelet[2614]: E0317 18:03:03.685827 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:04.686586 kubelet[2614]: E0317 18:03:04.686530 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:05.371742 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4132351409.mount: Deactivated successfully. Mar 17 18:03:05.687289 kubelet[2614]: E0317 18:03:05.686988 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:06.592898 containerd[1738]: time="2025-03-17T18:03:06.592849368Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:06.595017 containerd[1738]: time="2025-03-17T18:03:06.594890999Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=73060131" Mar 17 18:03:06.598856 containerd[1738]: time="2025-03-17T18:03:06.598744457Z" level=info msg="ImageCreate event name:\"sha256:d25119ebd2aadc346788ac84ae0c5b1b018c687dcfd3167bb27e341f8b5caeee\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:06.605517 containerd[1738]: time="2025-03-17T18:03:06.605465259Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:06.606524 containerd[1738]: time="2025-03-17T18:03:06.606367073Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:d25119ebd2aadc346788ac84ae0c5b1b018c687dcfd3167bb27e341f8b5caeee\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06\", size \"73060009\" in 4.18987012s" Mar 17 18:03:06.606524 containerd[1738]: time="2025-03-17T18:03:06.606407773Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:d25119ebd2aadc346788ac84ae0c5b1b018c687dcfd3167bb27e341f8b5caeee\"" Mar 17 18:03:06.608297 containerd[1738]: time="2025-03-17T18:03:06.608166800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 17 18:03:06.608943 containerd[1738]: time="2025-03-17T18:03:06.608915311Z" level=info msg="CreateContainer within sandbox 
\"03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Mar 17 18:03:06.643896 containerd[1738]: time="2025-03-17T18:03:06.643858041Z" level=info msg="CreateContainer within sandbox \"03c3c911e439ceb9ddb0633c0663626e2d488764342a6c2aec6d294a922a6ff1\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id \"243351a6ab971c6499550d910696e0556f86308a59471d250eb9874d845b26c5\"" Mar 17 18:03:06.644468 containerd[1738]: time="2025-03-17T18:03:06.644436450Z" level=info msg="StartContainer for \"243351a6ab971c6499550d910696e0556f86308a59471d250eb9874d845b26c5\"" Mar 17 18:03:06.676344 systemd[1]: Started cri-containerd-243351a6ab971c6499550d910696e0556f86308a59471d250eb9874d845b26c5.scope - libcontainer container 243351a6ab971c6499550d910696e0556f86308a59471d250eb9874d845b26c5. Mar 17 18:03:06.688099 kubelet[2614]: E0317 18:03:06.688064 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:06.706074 containerd[1738]: time="2025-03-17T18:03:06.705977184Z" level=info msg="StartContainer for \"243351a6ab971c6499550d910696e0556f86308a59471d250eb9874d845b26c5\" returns successfully" Mar 17 18:03:06.963572 kubelet[2614]: I0317 18:03:06.963146 2614 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nginx-deployment-7fcdb87857-lzds7" podStartSLOduration=11.659937692 podStartE2EDuration="17.963131685s" podCreationTimestamp="2025-03-17 18:02:49 +0000 UTC" firstStartedPulling="2025-03-17 18:03:00.304407398 +0000 UTC m=+26.353640273" lastFinishedPulling="2025-03-17 18:03:06.607601391 +0000 UTC m=+32.656834266" observedRunningTime="2025-03-17 18:03:06.963003883 +0000 UTC m=+33.012236858" watchObservedRunningTime="2025-03-17 18:03:06.963131685 +0000 UTC m=+33.012364560" Mar 17 18:03:07.688385 kubelet[2614]: E0317 18:03:07.688322 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:08.688661 kubelet[2614]: E0317 18:03:08.688617 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:08.707756 containerd[1738]: time="2025-03-17T18:03:08.707705250Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:08.710311 containerd[1738]: time="2025-03-17T18:03:08.710231789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 17 18:03:08.713783 containerd[1738]: time="2025-03-17T18:03:08.713727742Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:08.718899 containerd[1738]: time="2025-03-17T18:03:08.718844519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:08.719931 containerd[1738]: time="2025-03-17T18:03:08.719501829Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo 
digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.111297029s" Mar 17 18:03:08.719931 containerd[1738]: time="2025-03-17T18:03:08.719540530Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 17 18:03:08.721568 containerd[1738]: time="2025-03-17T18:03:08.721532660Z" level=info msg="CreateContainer within sandbox \"c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 18:03:08.756510 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2743147675.mount: Deactivated successfully. Mar 17 18:03:08.762398 containerd[1738]: time="2025-03-17T18:03:08.762366580Z" level=info msg="CreateContainer within sandbox \"c47a7c91871ef4ae6583c4de3a1b95edcecaf00acbf60408469242a14b795ca7\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"73e9d287f5933c1f4775e7e2abbe962df5eaf53099f4ffa8ddb876baef85c30b\"" Mar 17 18:03:08.762943 containerd[1738]: time="2025-03-17T18:03:08.762883887Z" level=info msg="StartContainer for \"73e9d287f5933c1f4775e7e2abbe962df5eaf53099f4ffa8ddb876baef85c30b\"" Mar 17 18:03:08.794367 systemd[1]: Started cri-containerd-73e9d287f5933c1f4775e7e2abbe962df5eaf53099f4ffa8ddb876baef85c30b.scope - libcontainer container 73e9d287f5933c1f4775e7e2abbe962df5eaf53099f4ffa8ddb876baef85c30b. Mar 17 18:03:08.822982 containerd[1738]: time="2025-03-17T18:03:08.822929398Z" level=info msg="StartContainer for \"73e9d287f5933c1f4775e7e2abbe962df5eaf53099f4ffa8ddb876baef85c30b\" returns successfully" Mar 17 18:03:09.688995 kubelet[2614]: E0317 18:03:09.688935 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:09.777770 kubelet[2614]: I0317 18:03:09.777737 2614 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 18:03:09.777770 kubelet[2614]: I0317 18:03:09.777776 2614 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 18:03:10.689353 kubelet[2614]: E0317 18:03:10.689292 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:11.690147 kubelet[2614]: E0317 18:03:11.690079 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:12.691037 kubelet[2614]: E0317 18:03:12.690976 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:12.954595 kubelet[2614]: I0317 18:03:12.954462 2614 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dqftv" podStartSLOduration=30.424133737 podStartE2EDuration="38.954443474s" podCreationTimestamp="2025-03-17 18:02:34 +0000 UTC" firstStartedPulling="2025-03-17 18:03:00.190018305 +0000 UTC m=+26.239251180" lastFinishedPulling="2025-03-17 18:03:08.720328042 +0000 UTC m=+34.769560917" observedRunningTime="2025-03-17 18:03:08.97528301 +0000 UTC m=+35.024515885" watchObservedRunningTime="2025-03-17 18:03:12.954443474 +0000 UTC 
m=+39.003676349" Mar 17 18:03:12.960292 systemd[1]: Created slice kubepods-besteffort-pod07f8106d_e397_452a_a75c_ed756bed2bee.slice - libcontainer container kubepods-besteffort-pod07f8106d_e397_452a_a75c_ed756bed2bee.slice. Mar 17 18:03:13.020555 kubelet[2614]: I0317 18:03:13.020466 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ssll\" (UniqueName: \"kubernetes.io/projected/07f8106d-e397-452a-a75c-ed756bed2bee-kube-api-access-6ssll\") pod \"nfs-server-provisioner-0\" (UID: \"07f8106d-e397-452a-a75c-ed756bed2bee\") " pod="default/nfs-server-provisioner-0" Mar 17 18:03:13.020724 kubelet[2614]: I0317 18:03:13.020578 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/07f8106d-e397-452a-a75c-ed756bed2bee-data\") pod \"nfs-server-provisioner-0\" (UID: \"07f8106d-e397-452a-a75c-ed756bed2bee\") " pod="default/nfs-server-provisioner-0" Mar 17 18:03:13.264030 containerd[1738]: time="2025-03-17T18:03:13.263888169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:07f8106d-e397-452a-a75c-ed756bed2bee,Namespace:default,Attempt:0,}" Mar 17 18:03:13.407780 systemd-networkd[1344]: cali60e51b789ff: Link UP Mar 17 18:03:13.408045 systemd-networkd[1344]: cali60e51b789ff: Gained carrier Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.345 [INFO][4355] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.200.4.11-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default 07f8106d-e397-452a-a75c-ed756bed2bee 1610 0 2025-03-17 18:03:12 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 10.200.4.11 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.345 [INFO][4355] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.368 [INFO][4367] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" HandleID="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Workload="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.378 [INFO][4367] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" HandleID="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Workload="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319790), Attrs:map[string]string{"namespace":"default", "node":"10.200.4.11", "pod":"nfs-server-provisioner-0", "timestamp":"2025-03-17 18:03:13.368186451 +0000 UTC"}, Hostname:"10.200.4.11", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.378 [INFO][4367] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.378 [INFO][4367] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.378 [INFO][4367] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.200.4.11' Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.380 [INFO][4367] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.383 [INFO][4367] ipam/ipam.go 372: Looking up existing affinities for host host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.387 [INFO][4367] ipam/ipam.go 489: Trying affinity for 192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.388 [INFO][4367] ipam/ipam.go 155: Attempting to load block cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.390 [INFO][4367] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.390 [INFO][4367] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.11.192/26 handle="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.391 [INFO][4367] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048 Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.397 [INFO][4367] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.11.192/26 handle="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.402 [INFO][4367] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.11.195/26] block=192.168.11.192/26 handle="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.402 [INFO][4367] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.11.195/26] handle="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" host="10.200.4.11" Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.402 [INFO][4367] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
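The IPAM records above reduce to simple CIDR arithmetic: node 10.200.4.11 holds an affinity for the block 192.168.11.192/26 and claims 192.168.11.195 out of it. A minimal standalone Go sketch of that containment check (illustrative only, not Calico's own code):

package main

import (
    "fmt"
    "net/netip"
)

func main() {
    // Values copied from the ipam/ipam.go records above.
    block := netip.MustParsePrefix("192.168.11.192/26")
    claimed := netip.MustParseAddr("192.168.11.195")

    // A /26 spans 2^(32-26) = 64 addresses, so this affine block can back
    // at most 64 workload endpoints on the node.
    size := 1 << (32 - block.Bits())

    fmt.Printf("block %s holds %d addresses\n", block, size)
    fmt.Printf("%s inside block: %v\n", claimed, block.Contains(claimed))
}

The endpoint written a few records later carries the same address as 192.168.11.195/32.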
Mar 17 18:03:13.419229 containerd[1738]: 2025-03-17 18:03:13.402 [INFO][4367] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.11.195/26] IPv6=[] ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" HandleID="k8s-pod-network.98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Workload="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.420397 containerd[1738]: 2025-03-17 18:03:13.404 [INFO][4355] cni-plugin/k8s.go 386: Populated endpoint ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"07f8106d-e397-452a-a75c-ed756bed2bee", ResourceVersion:"1610", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 3, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.11.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:13.420397 containerd[1738]: 2025-03-17 18:03:13.404 [INFO][4355] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.11.195/32] ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.420397 containerd[1738]: 2025-03-17 18:03:13.404 [INFO][4355] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.420397 containerd[1738]: 2025-03-17 18:03:13.407 [INFO][4355] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.420741 containerd[1738]: 2025-03-17 18:03:13.408 [INFO][4355] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"07f8106d-e397-452a-a75c-ed756bed2bee", ResourceVersion:"1610", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 3, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.11.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"22:90:dd:61:4f:05", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:13.420741 containerd[1738]: 2025-03-17 18:03:13.417 [INFO][4355] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="10.200.4.11-k8s-nfs--server--provisioner--0-eth0" Mar 17 18:03:13.445644 containerd[1738]: time="2025-03-17T18:03:13.445554024Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:03:13.445897 containerd[1738]: time="2025-03-17T18:03:13.445677726Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:03:13.445897 containerd[1738]: time="2025-03-17T18:03:13.445773428Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:13.446124 containerd[1738]: time="2025-03-17T18:03:13.446007231Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:13.479354 systemd[1]: Started cri-containerd-98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048.scope - libcontainer container 98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048. 
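The two WorkloadEndpoint dumps above list the NFS-related ports twice: in decimal in the CNI plugin's port list (nfs 2049, nlockmgr 32803, mountd 20048, rquotad 875, rpcbind 111, statd 662) and in hex in the datastore struct (Port:0x801, 0x8023, 0x4e50, 0x36b, 0x6f, 0x296). A short Go sketch, purely for illustration, confirming the two encodings are the same numbers:

package main

import "fmt"

func main() {
    // Hex values copied from the v3.WorkloadEndpointPort dump above,
    // decimal values from the CNI plugin's port list for the same endpoint.
    ports := []struct {
        name     string
        hexValue uint16
        decimal  uint16
    }{
        {"nfs", 0x801, 2049},
        {"nlockmgr", 0x8023, 32803},
        {"mountd", 0x4e50, 20048},
        {"rquotad", 0x36b, 875},
        {"rpcbind", 0x6f, 111},
        {"statd", 0x296, 662},
    }
    for _, p := range ports {
        fmt.Printf("%-8s 0x%04x = %5d  match=%v\n",
            p.name, p.hexValue, p.decimal, p.hexValue == p.decimal)
    }
}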
Mar 17 18:03:13.517118 containerd[1738]: time="2025-03-17T18:03:13.517015009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:07f8106d-e397-452a-a75c-ed756bed2bee,Namespace:default,Attempt:0,} returns sandbox id \"98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048\"" Mar 17 18:03:13.519080 containerd[1738]: time="2025-03-17T18:03:13.519055540Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Mar 17 18:03:13.691181 kubelet[2614]: E0317 18:03:13.691135 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:14.132704 systemd[1]: run-containerd-runc-k8s.io-98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048-runc.YslnTL.mount: Deactivated successfully. Mar 17 18:03:14.666807 kubelet[2614]: E0317 18:03:14.666732 2614 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:14.691355 kubelet[2614]: E0317 18:03:14.691276 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:15.252294 systemd-networkd[1344]: cali60e51b789ff: Gained IPv6LL Mar 17 18:03:15.691615 kubelet[2614]: E0317 18:03:15.691569 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:16.061107 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2270103737.mount: Deactivated successfully. Mar 17 18:03:16.692375 kubelet[2614]: E0317 18:03:16.692332 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:17.692909 kubelet[2614]: E0317 18:03:17.692850 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:18.657493 containerd[1738]: time="2025-03-17T18:03:18.657443871Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:18.660602 containerd[1738]: time="2025-03-17T18:03:18.660517116Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=91039414" Mar 17 18:03:18.663599 containerd[1738]: time="2025-03-17T18:03:18.663547060Z" level=info msg="ImageCreate event name:\"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:18.669367 containerd[1738]: time="2025-03-17T18:03:18.669331344Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:18.674227 containerd[1738]: time="2025-03-17T18:03:18.672518490Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\", repo tag \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest \"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"91036984\" in 5.153336649s" Mar 17 18:03:18.674227 containerd[1738]: time="2025-03-17T18:03:18.672566391Z" level=info msg="PullImage 
\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\"" Mar 17 18:03:18.678079 containerd[1738]: time="2025-03-17T18:03:18.678044671Z" level=info msg="CreateContainer within sandbox \"98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}" Mar 17 18:03:18.693399 kubelet[2614]: E0317 18:03:18.693369 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:18.712991 containerd[1738]: time="2025-03-17T18:03:18.712955978Z" level=info msg="CreateContainer within sandbox \"98e319cb767bf80884a4002bc473b5690a8d09b03a08b59687a04140b6a71048\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"0efedbc213734b164419feb9cf34bf9b87d65e7407694cd4a497466e9c52bf85\"" Mar 17 18:03:18.713483 containerd[1738]: time="2025-03-17T18:03:18.713437485Z" level=info msg="StartContainer for \"0efedbc213734b164419feb9cf34bf9b87d65e7407694cd4a497466e9c52bf85\"" Mar 17 18:03:18.744584 systemd[1]: run-containerd-runc-k8s.io-0efedbc213734b164419feb9cf34bf9b87d65e7407694cd4a497466e9c52bf85-runc.HdAZaJ.mount: Deactivated successfully. Mar 17 18:03:18.754365 systemd[1]: Started cri-containerd-0efedbc213734b164419feb9cf34bf9b87d65e7407694cd4a497466e9c52bf85.scope - libcontainer container 0efedbc213734b164419feb9cf34bf9b87d65e7407694cd4a497466e9c52bf85. Mar 17 18:03:18.781536 containerd[1738]: time="2025-03-17T18:03:18.781408872Z" level=info msg="StartContainer for \"0efedbc213734b164419feb9cf34bf9b87d65e7407694cd4a497466e9c52bf85\" returns successfully" Mar 17 18:03:18.998857 kubelet[2614]: I0317 18:03:18.998698 2614 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=1.8409990139999999 podStartE2EDuration="6.998682527s" podCreationTimestamp="2025-03-17 18:03:12 +0000 UTC" firstStartedPulling="2025-03-17 18:03:13.518719234 +0000 UTC m=+39.567952109" lastFinishedPulling="2025-03-17 18:03:18.676402647 +0000 UTC m=+44.725635622" observedRunningTime="2025-03-17 18:03:18.998379923 +0000 UTC m=+45.047612898" watchObservedRunningTime="2025-03-17 18:03:18.998682527 +0000 UTC m=+45.047915802" Mar 17 18:03:19.693768 kubelet[2614]: E0317 18:03:19.693702 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:20.694216 kubelet[2614]: E0317 18:03:20.694139 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:21.695274 kubelet[2614]: E0317 18:03:21.695199 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:22.695969 kubelet[2614]: E0317 18:03:22.695911 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:23.697127 kubelet[2614]: E0317 18:03:23.697021 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:24.698092 kubelet[2614]: E0317 18:03:24.698036 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:25.698648 kubelet[2614]: E0317 18:03:25.698557 2614 file_linux.go:61] "Unable to read config path" 
err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:26.699771 kubelet[2614]: E0317 18:03:26.699672 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:27.700081 kubelet[2614]: E0317 18:03:27.700020 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:28.701324 kubelet[2614]: E0317 18:03:28.701196 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:29.702198 kubelet[2614]: E0317 18:03:29.702102 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:29.867913 waagent[1941]: 2025-03-17T18:03:29.867824Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 2] Mar 17 18:03:29.877181 waagent[1941]: 2025-03-17T18:03:29.877122Z INFO ExtHandler Mar 17 18:03:29.877308 waagent[1941]: 2025-03-17T18:03:29.877259Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: de733ed0-018b-4459-b522-645c8c4d0728 eTag: 13955930823355648215 source: Fabric] Mar 17 18:03:29.877652 waagent[1941]: 2025-03-17T18:03:29.877596Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 17 18:03:29.878220 waagent[1941]: 2025-03-17T18:03:29.878161Z INFO ExtHandler Mar 17 18:03:29.878302 waagent[1941]: 2025-03-17T18:03:29.878270Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 2] Mar 17 18:03:29.881872 waagent[1941]: 2025-03-17T18:03:29.881829Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 17 18:03:29.943533 waagent[1941]: 2025-03-17T18:03:29.943463Z INFO ExtHandler Downloaded certificate {'thumbprint': '8744BB52BA684F87321E6E6C3FBBEF8E6C239273', 'hasPrivateKey': True} Mar 17 18:03:29.943983 waagent[1941]: 2025-03-17T18:03:29.943929Z INFO ExtHandler Fetch goal state completed Mar 17 18:03:29.944378 waagent[1941]: 2025-03-17T18:03:29.944331Z INFO ExtHandler ExtHandler Mar 17 18:03:29.944455 waagent[1941]: 2025-03-17T18:03:29.944424Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_2 channel: WireServer source: Fabric activity: c1728bfb-2133-46be-82ef-8f5322c29cfe correlation bfee555b-a980-491e-aa0c-818f1c1e6bc8 created: 2025-03-17T18:03:22.472931Z] Mar 17 18:03:29.944818 waagent[1941]: 2025-03-17T18:03:29.944769Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 17 18:03:29.945320 waagent[1941]: 2025-03-17T18:03:29.945275Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_2 0 ms] Mar 17 18:03:30.702957 kubelet[2614]: E0317 18:03:30.702893 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:31.703924 kubelet[2614]: E0317 18:03:31.703807 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:32.705045 kubelet[2614]: E0317 18:03:32.704953 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:33.705430 kubelet[2614]: E0317 18:03:33.705373 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:34.666065 kubelet[2614]: E0317 18:03:34.666002 2614 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:34.693151 containerd[1738]: time="2025-03-17T18:03:34.693110878Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:03:34.693822 containerd[1738]: time="2025-03-17T18:03:34.693262580Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:03:34.693822 containerd[1738]: time="2025-03-17T18:03:34.693282280Z" level=info msg="StopPodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:03:34.693822 containerd[1738]: time="2025-03-17T18:03:34.693671286Z" level=info msg="RemovePodSandbox for \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:03:34.693822 containerd[1738]: time="2025-03-17T18:03:34.693697286Z" level=info msg="Forcibly stopping sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\"" Mar 17 18:03:34.693822 containerd[1738]: time="2025-03-17T18:03:34.693767287Z" level=info msg="TearDown network for sandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" successfully" Mar 17 18:03:34.699091 containerd[1738]: time="2025-03-17T18:03:34.699055362Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
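Stepping back to the image pulls recorded earlier: containerd logs both the bytes read and the elapsed time, which is enough for a rough throughput estimate (for example, the nfs-provisioner:v4.0.8 pull reports bytes read=91039414 and completes "in 5.153336649s"). A small illustrative Go sketch of that estimate; the "bytes read" counter and the reported image size (91036984) differ slightly, so treat the result as approximate:

package main

import "fmt"

func main() {
    // Figures copied from the containerd records above for
    // registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8.
    const bytesRead = 91039414      // "bytes read=91039414"
    const pullSeconds = 5.153336649 // "in 5.153336649s"

    fmt.Printf("pulled %.1f MB in %.2fs, roughly %.1f MB/s\n",
        float64(bytesRead)/1e6, pullSeconds, float64(bytesRead)/1e6/pullSeconds)

    // The earlier node-driver-registrar pull (13986843 bytes in 2.111297029s)
    // works out to roughly 6.6 MB/s by the same arithmetic.
}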
Mar 17 18:03:34.699193 containerd[1738]: time="2025-03-17T18:03:34.699101663Z" level=info msg="RemovePodSandbox \"c5a979e96465a3d8995d43f95e14bddebe64ebc9a42fb177a1db2399a20b4f22\" returns successfully" Mar 17 18:03:34.699455 containerd[1738]: time="2025-03-17T18:03:34.699422568Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:03:34.699540 containerd[1738]: time="2025-03-17T18:03:34.699506669Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:03:34.699540 containerd[1738]: time="2025-03-17T18:03:34.699520969Z" level=info msg="StopPodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:03:34.699830 containerd[1738]: time="2025-03-17T18:03:34.699800173Z" level=info msg="RemovePodSandbox for \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:03:34.699903 containerd[1738]: time="2025-03-17T18:03:34.699827173Z" level=info msg="Forcibly stopping sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\"" Mar 17 18:03:34.699960 containerd[1738]: time="2025-03-17T18:03:34.699914975Z" level=info msg="TearDown network for sandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" successfully" Mar 17 18:03:34.705832 containerd[1738]: time="2025-03-17T18:03:34.705803259Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.705950 containerd[1738]: time="2025-03-17T18:03:34.705841959Z" level=info msg="RemovePodSandbox \"52435524e2d57f0ed9c42b8d3bdf434d2ffbc47d3839a667508420a861bc0dfd\" returns successfully" Mar 17 18:03:34.706079 kubelet[2614]: E0317 18:03:34.706047 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:34.706424 containerd[1738]: time="2025-03-17T18:03:34.706124063Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:03:34.706424 containerd[1738]: time="2025-03-17T18:03:34.706233465Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:03:34.706424 containerd[1738]: time="2025-03-17T18:03:34.706282065Z" level=info msg="StopPodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:03:34.706786 containerd[1738]: time="2025-03-17T18:03:34.706724372Z" level=info msg="RemovePodSandbox for \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:03:34.706786 containerd[1738]: time="2025-03-17T18:03:34.706754072Z" level=info msg="Forcibly stopping sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\"" Mar 17 18:03:34.706921 containerd[1738]: time="2025-03-17T18:03:34.706827073Z" level=info msg="TearDown network for sandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" successfully" Mar 17 18:03:34.711658 containerd[1738]: time="2025-03-17T18:03:34.711630942Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\": an error occurred when try to find 
sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.711751 containerd[1738]: time="2025-03-17T18:03:34.711668842Z" level=info msg="RemovePodSandbox \"f4488d3614c4e418129e8fe0c3227ea57936d22d0cd8a8632f84e4e5a0bcd4ba\" returns successfully" Mar 17 18:03:34.712024 containerd[1738]: time="2025-03-17T18:03:34.711933246Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:03:34.712149 containerd[1738]: time="2025-03-17T18:03:34.712021047Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:03:34.712149 containerd[1738]: time="2025-03-17T18:03:34.712035647Z" level=info msg="StopPodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:03:34.712359 containerd[1738]: time="2025-03-17T18:03:34.712282051Z" level=info msg="RemovePodSandbox for \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:03:34.712359 containerd[1738]: time="2025-03-17T18:03:34.712308251Z" level=info msg="Forcibly stopping sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\"" Mar 17 18:03:34.712465 containerd[1738]: time="2025-03-17T18:03:34.712380352Z" level=info msg="TearDown network for sandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" successfully" Mar 17 18:03:34.718899 containerd[1738]: time="2025-03-17T18:03:34.718863945Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.719168 containerd[1738]: time="2025-03-17T18:03:34.719036747Z" level=info msg="RemovePodSandbox \"dfbc8d8ae21608e0cb3d4bbc130da99d00dd7cda887c094dade802cce1ab0d08\" returns successfully" Mar 17 18:03:34.719383 containerd[1738]: time="2025-03-17T18:03:34.719326351Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:03:34.719501 containerd[1738]: time="2025-03-17T18:03:34.719414152Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:03:34.719501 containerd[1738]: time="2025-03-17T18:03:34.719430253Z" level=info msg="StopPodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:03:34.719730 containerd[1738]: time="2025-03-17T18:03:34.719706557Z" level=info msg="RemovePodSandbox for \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:03:34.719800 containerd[1738]: time="2025-03-17T18:03:34.719737157Z" level=info msg="Forcibly stopping sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\"" Mar 17 18:03:34.719846 containerd[1738]: time="2025-03-17T18:03:34.719808458Z" level=info msg="TearDown network for sandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" successfully" Mar 17 18:03:34.725991 containerd[1738]: time="2025-03-17T18:03:34.725965446Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.726153 containerd[1738]: time="2025-03-17T18:03:34.726003146Z" level=info msg="RemovePodSandbox \"d21e06ded1fd2f817255f56b4f7083c0198f5b45bd3ebc7f5a9c91d906d71e4e\" returns successfully" Mar 17 18:03:34.728222 containerd[1738]: time="2025-03-17T18:03:34.726517554Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:03:34.728222 containerd[1738]: time="2025-03-17T18:03:34.726635355Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:03:34.728222 containerd[1738]: time="2025-03-17T18:03:34.726655056Z" level=info msg="StopPodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:03:34.729123 containerd[1738]: time="2025-03-17T18:03:34.729081690Z" level=info msg="RemovePodSandbox for \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:03:34.729197 containerd[1738]: time="2025-03-17T18:03:34.729129891Z" level=info msg="Forcibly stopping sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\"" Mar 17 18:03:34.729291 containerd[1738]: time="2025-03-17T18:03:34.729251293Z" level=info msg="TearDown network for sandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" successfully" Mar 17 18:03:34.739311 containerd[1738]: time="2025-03-17T18:03:34.739279335Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.739473 containerd[1738]: time="2025-03-17T18:03:34.739439638Z" level=info msg="RemovePodSandbox \"41c11ef50f48d69b671fb84a26382c1e8694aba0a09f87382a23343576996287\" returns successfully" Mar 17 18:03:34.739915 containerd[1738]: time="2025-03-17T18:03:34.739881944Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:03:34.740199 containerd[1738]: time="2025-03-17T18:03:34.740162748Z" level=info msg="TearDown network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" successfully" Mar 17 18:03:34.740325 containerd[1738]: time="2025-03-17T18:03:34.740298150Z" level=info msg="StopPodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" returns successfully" Mar 17 18:03:34.740648 containerd[1738]: time="2025-03-17T18:03:34.740622755Z" level=info msg="RemovePodSandbox for \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:03:34.740648 containerd[1738]: time="2025-03-17T18:03:34.740645855Z" level=info msg="Forcibly stopping sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\"" Mar 17 18:03:34.740760 containerd[1738]: time="2025-03-17T18:03:34.740713156Z" level=info msg="TearDown network for sandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" successfully" Mar 17 18:03:34.749451 containerd[1738]: time="2025-03-17T18:03:34.749424080Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.749558 containerd[1738]: time="2025-03-17T18:03:34.749460681Z" level=info msg="RemovePodSandbox \"a4f3de5f1ddc2295be2dedf77ecdb8523fa586a596f1c0a933e758f044078e31\" returns successfully" Mar 17 18:03:34.750290 containerd[1738]: time="2025-03-17T18:03:34.750048389Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" Mar 17 18:03:34.750290 containerd[1738]: time="2025-03-17T18:03:34.750132090Z" level=info msg="TearDown network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" successfully" Mar 17 18:03:34.750290 containerd[1738]: time="2025-03-17T18:03:34.750175591Z" level=info msg="StopPodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" returns successfully" Mar 17 18:03:34.752497 containerd[1738]: time="2025-03-17T18:03:34.752480424Z" level=info msg="RemovePodSandbox for \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" Mar 17 18:03:34.752690 containerd[1738]: time="2025-03-17T18:03:34.752565125Z" level=info msg="Forcibly stopping sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\"" Mar 17 18:03:34.752690 containerd[1738]: time="2025-03-17T18:03:34.752632426Z" level=info msg="TearDown network for sandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" successfully" Mar 17 18:03:34.760081 containerd[1738]: time="2025-03-17T18:03:34.760053431Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.760177 containerd[1738]: time="2025-03-17T18:03:34.760089432Z" level=info msg="RemovePodSandbox \"5925b1c054cddf884693148657c108e5ec43b447add1b17d3eeec40d72aca436\" returns successfully" Mar 17 18:03:34.760399 containerd[1738]: time="2025-03-17T18:03:34.760378036Z" level=info msg="StopPodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\"" Mar 17 18:03:34.760548 containerd[1738]: time="2025-03-17T18:03:34.760527638Z" level=info msg="TearDown network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" successfully" Mar 17 18:03:34.760548 containerd[1738]: time="2025-03-17T18:03:34.760544038Z" level=info msg="StopPodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" returns successfully" Mar 17 18:03:34.760860 containerd[1738]: time="2025-03-17T18:03:34.760834943Z" level=info msg="RemovePodSandbox for \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\"" Mar 17 18:03:34.760930 containerd[1738]: time="2025-03-17T18:03:34.760864543Z" level=info msg="Forcibly stopping sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\"" Mar 17 18:03:34.760985 containerd[1738]: time="2025-03-17T18:03:34.760931044Z" level=info msg="TearDown network for sandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" successfully" Mar 17 18:03:34.769576 containerd[1738]: time="2025-03-17T18:03:34.769529366Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.769655 containerd[1738]: time="2025-03-17T18:03:34.769584467Z" level=info msg="RemovePodSandbox \"1ac3db2fc30acbb7f5048bfd4c498f376c52d2ff80c3ebc46f17c30c3be32ba5\" returns successfully" Mar 17 18:03:34.769956 containerd[1738]: time="2025-03-17T18:03:34.769935572Z" level=info msg="StopPodSandbox for \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\"" Mar 17 18:03:34.770160 containerd[1738]: time="2025-03-17T18:03:34.770127675Z" level=info msg="TearDown network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\" successfully" Mar 17 18:03:34.770330 containerd[1738]: time="2025-03-17T18:03:34.770243777Z" level=info msg="StopPodSandbox for \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\" returns successfully" Mar 17 18:03:34.770548 containerd[1738]: time="2025-03-17T18:03:34.770527681Z" level=info msg="RemovePodSandbox for \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\"" Mar 17 18:03:34.770783 containerd[1738]: time="2025-03-17T18:03:34.770650882Z" level=info msg="Forcibly stopping sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\"" Mar 17 18:03:34.770783 containerd[1738]: time="2025-03-17T18:03:34.770727684Z" level=info msg="TearDown network for sandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\" successfully" Mar 17 18:03:34.778055 containerd[1738]: time="2025-03-17T18:03:34.778029388Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.778163 containerd[1738]: time="2025-03-17T18:03:34.778064988Z" level=info msg="RemovePodSandbox \"33a03848db4414b92739870db71abc37db276dfbe7b6d1c161e00fdf3fdeecb2\" returns successfully" Mar 17 18:03:34.778380 containerd[1738]: time="2025-03-17T18:03:34.778331592Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:03:34.778472 containerd[1738]: time="2025-03-17T18:03:34.778419493Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:03:34.778472 containerd[1738]: time="2025-03-17T18:03:34.778434693Z" level=info msg="StopPodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:03:34.778773 containerd[1738]: time="2025-03-17T18:03:34.778691097Z" level=info msg="RemovePodSandbox for \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:03:34.778773 containerd[1738]: time="2025-03-17T18:03:34.778716597Z" level=info msg="Forcibly stopping sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\"" Mar 17 18:03:34.778892 containerd[1738]: time="2025-03-17T18:03:34.778787398Z" level=info msg="TearDown network for sandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" successfully" Mar 17 18:03:34.787069 containerd[1738]: time="2025-03-17T18:03:34.786988315Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.787542 containerd[1738]: time="2025-03-17T18:03:34.787084317Z" level=info msg="RemovePodSandbox \"616a2bd94d9c81ae2c663d3f3b58a735a7754740d2476d50c05fba0cdd3398c0\" returns successfully" Mar 17 18:03:34.787685 containerd[1738]: time="2025-03-17T18:03:34.787660525Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:03:34.787764 containerd[1738]: time="2025-03-17T18:03:34.787751326Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:03:34.787810 containerd[1738]: time="2025-03-17T18:03:34.787765626Z" level=info msg="StopPodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:03:34.788106 containerd[1738]: time="2025-03-17T18:03:34.787997830Z" level=info msg="RemovePodSandbox for \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:03:34.788106 containerd[1738]: time="2025-03-17T18:03:34.788074331Z" level=info msg="Forcibly stopping sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\"" Mar 17 18:03:34.788246 containerd[1738]: time="2025-03-17T18:03:34.788160232Z" level=info msg="TearDown network for sandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" successfully" Mar 17 18:03:34.796201 containerd[1738]: time="2025-03-17T18:03:34.796174246Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.796292 containerd[1738]: time="2025-03-17T18:03:34.796225247Z" level=info msg="RemovePodSandbox \"8af1ad6fde1fe7d6bc16ec6de0d4cc5cde9f122359d7d8c939028453e8f3bdc4\" returns successfully" Mar 17 18:03:34.796600 containerd[1738]: time="2025-03-17T18:03:34.796537851Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:03:34.796680 containerd[1738]: time="2025-03-17T18:03:34.796629853Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:03:34.796680 containerd[1738]: time="2025-03-17T18:03:34.796644653Z" level=info msg="StopPodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:03:34.796989 containerd[1738]: time="2025-03-17T18:03:34.796910957Z" level=info msg="RemovePodSandbox for \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:03:34.796989 containerd[1738]: time="2025-03-17T18:03:34.796938457Z" level=info msg="Forcibly stopping sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\"" Mar 17 18:03:34.797157 containerd[1738]: time="2025-03-17T18:03:34.797043359Z" level=info msg="TearDown network for sandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" successfully" Mar 17 18:03:34.806007 containerd[1738]: time="2025-03-17T18:03:34.805978186Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.806113 containerd[1738]: time="2025-03-17T18:03:34.806054487Z" level=info msg="RemovePodSandbox \"27448eb49d0e145898546438106b8aae873192b188373a11a32a08f2fd65020c\" returns successfully" Mar 17 18:03:34.806473 containerd[1738]: time="2025-03-17T18:03:34.806437692Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:03:34.806587 containerd[1738]: time="2025-03-17T18:03:34.806527994Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:03:34.806587 containerd[1738]: time="2025-03-17T18:03:34.806581694Z" level=info msg="StopPodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:03:34.806897 containerd[1738]: time="2025-03-17T18:03:34.806874499Z" level=info msg="RemovePodSandbox for \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:03:34.806962 containerd[1738]: time="2025-03-17T18:03:34.806906199Z" level=info msg="Forcibly stopping sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\"" Mar 17 18:03:34.807083 containerd[1738]: time="2025-03-17T18:03:34.806976600Z" level=info msg="TearDown network for sandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" successfully" Mar 17 18:03:34.816399 containerd[1738]: time="2025-03-17T18:03:34.816373434Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.816509 containerd[1738]: time="2025-03-17T18:03:34.816413234Z" level=info msg="RemovePodSandbox \"a04a872d4f869a629e2ae3e62c1fcdbf90598a546b894fa203aa232b640b1a75\" returns successfully" Mar 17 18:03:34.816760 containerd[1738]: time="2025-03-17T18:03:34.816726339Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:03:34.816865 containerd[1738]: time="2025-03-17T18:03:34.816815140Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:03:34.816865 containerd[1738]: time="2025-03-17T18:03:34.816834040Z" level=info msg="StopPodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:03:34.817221 containerd[1738]: time="2025-03-17T18:03:34.817173245Z" level=info msg="RemovePodSandbox for \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:03:34.817298 containerd[1738]: time="2025-03-17T18:03:34.817202046Z" level=info msg="Forcibly stopping sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\"" Mar 17 18:03:34.817357 containerd[1738]: time="2025-03-17T18:03:34.817315047Z" level=info msg="TearDown network for sandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" successfully" Mar 17 18:03:34.826142 containerd[1738]: time="2025-03-17T18:03:34.826116173Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.826244 containerd[1738]: time="2025-03-17T18:03:34.826153773Z" level=info msg="RemovePodSandbox \"65afc69099c654b91dba3c92b14b8bb25bad025879dc085ee674162455cf4749\" returns successfully" Mar 17 18:03:34.826544 containerd[1738]: time="2025-03-17T18:03:34.826463678Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:03:34.826618 containerd[1738]: time="2025-03-17T18:03:34.826553679Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:03:34.826618 containerd[1738]: time="2025-03-17T18:03:34.826567379Z" level=info msg="StopPodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:03:34.826846 containerd[1738]: time="2025-03-17T18:03:34.826824383Z" level=info msg="RemovePodSandbox for \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:03:34.826904 containerd[1738]: time="2025-03-17T18:03:34.826851683Z" level=info msg="Forcibly stopping sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\"" Mar 17 18:03:34.827038 containerd[1738]: time="2025-03-17T18:03:34.826923584Z" level=info msg="TearDown network for sandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" successfully" Mar 17 18:03:34.838323 containerd[1738]: time="2025-03-17T18:03:34.838296546Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.838579 containerd[1738]: time="2025-03-17T18:03:34.838441948Z" level=info msg="RemovePodSandbox \"dd30abbe6c61d449640948c42f0ebfa4b136c877b4dd8eeea05730176cc03540\" returns successfully" Mar 17 18:03:34.839172 containerd[1738]: time="2025-03-17T18:03:34.839124258Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:03:34.839648 containerd[1738]: time="2025-03-17T18:03:34.839409762Z" level=info msg="TearDown network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" successfully" Mar 17 18:03:34.839648 containerd[1738]: time="2025-03-17T18:03:34.839432762Z" level=info msg="StopPodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" returns successfully" Mar 17 18:03:34.841075 containerd[1738]: time="2025-03-17T18:03:34.839879669Z" level=info msg="RemovePodSandbox for \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:03:34.841075 containerd[1738]: time="2025-03-17T18:03:34.839908869Z" level=info msg="Forcibly stopping sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\"" Mar 17 18:03:34.841075 containerd[1738]: time="2025-03-17T18:03:34.839998371Z" level=info msg="TearDown network for sandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" successfully" Mar 17 18:03:34.849985 containerd[1738]: time="2025-03-17T18:03:34.849847911Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.849985 containerd[1738]: time="2025-03-17T18:03:34.849972913Z" level=info msg="RemovePodSandbox \"9fed94cf7ef01959f9fae9609e2db23179bd8eeb07fe9ef2e40a11c881769ee3\" returns successfully" Mar 17 18:03:34.850388 containerd[1738]: time="2025-03-17T18:03:34.850358318Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" Mar 17 18:03:34.850472 containerd[1738]: time="2025-03-17T18:03:34.850445719Z" level=info msg="TearDown network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" successfully" Mar 17 18:03:34.850472 containerd[1738]: time="2025-03-17T18:03:34.850461020Z" level=info msg="StopPodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" returns successfully" Mar 17 18:03:34.850746 containerd[1738]: time="2025-03-17T18:03:34.850723223Z" level=info msg="RemovePodSandbox for \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" Mar 17 18:03:34.850806 containerd[1738]: time="2025-03-17T18:03:34.850752924Z" level=info msg="Forcibly stopping sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\"" Mar 17 18:03:34.850921 containerd[1738]: time="2025-03-17T18:03:34.850854425Z" level=info msg="TearDown network for sandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" successfully" Mar 17 18:03:34.861485 containerd[1738]: time="2025-03-17T18:03:34.861410976Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.861743 containerd[1738]: time="2025-03-17T18:03:34.861520977Z" level=info msg="RemovePodSandbox \"6a91b3dd10ac263b782517ef9388538c8769c9c6a71423737bb051e3c7fce0d7\" returns successfully" Mar 17 18:03:34.862564 containerd[1738]: time="2025-03-17T18:03:34.862336989Z" level=info msg="StopPodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\"" Mar 17 18:03:34.862564 containerd[1738]: time="2025-03-17T18:03:34.862508591Z" level=info msg="TearDown network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" successfully" Mar 17 18:03:34.862564 containerd[1738]: time="2025-03-17T18:03:34.862526992Z" level=info msg="StopPodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" returns successfully" Mar 17 18:03:34.862945 containerd[1738]: time="2025-03-17T18:03:34.862853796Z" level=info msg="RemovePodSandbox for \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\"" Mar 17 18:03:34.862945 containerd[1738]: time="2025-03-17T18:03:34.862882197Z" level=info msg="Forcibly stopping sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\"" Mar 17 18:03:34.863069 containerd[1738]: time="2025-03-17T18:03:34.862969998Z" level=info msg="TearDown network for sandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" successfully" Mar 17 18:03:34.872669 containerd[1738]: time="2025-03-17T18:03:34.872641836Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 18:03:34.872759 containerd[1738]: time="2025-03-17T18:03:34.872679936Z" level=info msg="RemovePodSandbox \"ab93b9076c4af1638a76488a92f3fcd0e67b92c7819004edbe797291afaf9efb\" returns successfully" Mar 17 18:03:34.873044 containerd[1738]: time="2025-03-17T18:03:34.872970440Z" level=info msg="StopPodSandbox for \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\"" Mar 17 18:03:34.873123 containerd[1738]: time="2025-03-17T18:03:34.873060642Z" level=info msg="TearDown network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\" successfully" Mar 17 18:03:34.873123 containerd[1738]: time="2025-03-17T18:03:34.873074442Z" level=info msg="StopPodSandbox for \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\" returns successfully" Mar 17 18:03:34.873449 containerd[1738]: time="2025-03-17T18:03:34.873423847Z" level=info msg="RemovePodSandbox for \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\"" Mar 17 18:03:34.873525 containerd[1738]: time="2025-03-17T18:03:34.873450147Z" level=info msg="Forcibly stopping sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\"" Mar 17 18:03:34.873572 containerd[1738]: time="2025-03-17T18:03:34.873535548Z" level=info msg="TearDown network for sandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\" successfully" Mar 17 18:03:34.883254 containerd[1738]: time="2025-03-17T18:03:34.883218586Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 18:03:34.883329 containerd[1738]: time="2025-03-17T18:03:34.883258887Z" level=info msg="RemovePodSandbox \"49f6efdac0e92062f5fc490dfa3ac2bb3df3ed7c4b8cb378383e89750776677a\" returns successfully" Mar 17 18:03:35.706361 kubelet[2614]: E0317 18:03:35.706291 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:36.707459 kubelet[2614]: E0317 18:03:36.707399 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:37.708485 kubelet[2614]: E0317 18:03:37.708391 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:38.709073 kubelet[2614]: E0317 18:03:38.708971 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:39.710199 kubelet[2614]: E0317 18:03:39.710095 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:40.711230 kubelet[2614]: E0317 18:03:40.711156 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:41.712354 kubelet[2614]: E0317 18:03:41.712265 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:42.712911 kubelet[2614]: E0317 18:03:42.712847 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:43.465631 systemd[1]: Created slice kubepods-besteffort-podaf78577f_32ef_44a4_a83c_54fd6e7249dc.slice - libcontainer container 
kubepods-besteffort-podaf78577f_32ef_44a4_a83c_54fd6e7249dc.slice. Mar 17 18:03:43.520165 kubelet[2614]: I0317 18:03:43.520056 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-55518c89-09d6-412e-8fb8-b555e07a5060\" (UniqueName: \"kubernetes.io/nfs/af78577f-32ef-44a4-a83c-54fd6e7249dc-pvc-55518c89-09d6-412e-8fb8-b555e07a5060\") pod \"test-pod-1\" (UID: \"af78577f-32ef-44a4-a83c-54fd6e7249dc\") " pod="default/test-pod-1" Mar 17 18:03:43.520165 kubelet[2614]: I0317 18:03:43.520108 2614 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkmq\" (UniqueName: \"kubernetes.io/projected/af78577f-32ef-44a4-a83c-54fd6e7249dc-kube-api-access-9gkmq\") pod \"test-pod-1\" (UID: \"af78577f-32ef-44a4-a83c-54fd6e7249dc\") " pod="default/test-pod-1" Mar 17 18:03:43.713818 kubelet[2614]: E0317 18:03:43.713773 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:43.816318 kernel: FS-Cache: Loaded Mar 17 18:03:43.948200 kernel: RPC: Registered named UNIX socket transport module. Mar 17 18:03:43.948329 kernel: RPC: Registered udp transport module. Mar 17 18:03:43.948351 kernel: RPC: Registered tcp transport module. Mar 17 18:03:43.950893 kernel: RPC: Registered tcp-with-tls transport module. Mar 17 18:03:43.950939 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module. Mar 17 18:03:44.348442 kernel: NFS: Registering the id_resolver key type Mar 17 18:03:44.348564 kernel: Key type id_resolver registered Mar 17 18:03:44.348587 kernel: Key type id_legacy registered Mar 17 18:03:44.459814 nfsidmap[4592]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain '1.0-a-d9de89fbd8' Mar 17 18:03:44.472272 nfsidmap[4593]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain '1.0-a-d9de89fbd8' Mar 17 18:03:44.668986 containerd[1738]: time="2025-03-17T18:03:44.668845889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:af78577f-32ef-44a4-a83c-54fd6e7249dc,Namespace:default,Attempt:0,}" Mar 17 18:03:44.714243 kubelet[2614]: E0317 18:03:44.714113 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:44.817859 systemd-networkd[1344]: cali5ec59c6bf6e: Link UP Mar 17 18:03:44.818128 systemd-networkd[1344]: cali5ec59c6bf6e: Gained carrier Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.736 [INFO][4594] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {10.200.4.11-k8s-test--pod--1-eth0 default af78577f-32ef-44a4-a83c-54fd6e7249dc 1710 0 2025-03-17 18:03:14 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 10.200.4.11 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.736 [INFO][4594] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" 
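Two details in the entries above are worth decoding before the Calico CNI setup continues. The kernel has just loaded the NFS client stack (FS-Cache, the RPC transport modules, and the id_resolver key types) so the test pod's NFS-backed claim pvc-55518c89-… can be mounted, and nfsidmap then reports that 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into the client's idmapping domain '1.0-a-d9de89fbd8'. That is the standard NFSv4 rule: a principal of the form user@domain is only translated to a local uid/gid when the domain part matches the client's own domain; otherwise the client typically falls back to an anonymous id. A rough Go illustration of that comparison — a sketch of the rule, not the actual nfsidmap code:

    package main

    import (
        "fmt"
        "strings"
    )

    // mapPrincipal sketches the NFSv4 id-mapping rule: "user@domain" is only
    // translated to a local name when the domain matches the client's
    // configured idmapping domain.
    func mapPrincipal(principal, localDomain string) (string, bool) {
        user, domain, ok := strings.Cut(principal, "@")
        if !ok || !strings.EqualFold(domain, localDomain) {
            return "", false // "does not map into domain", as logged by nfsidmap
        }
        return user, true
    }

    func main() {
        principal := "root@nfs-server-provisioner.default.svc.cluster.local"
        if name, ok := mapPrincipal(principal, "1.0-a-d9de89fbd8"); ok {
            fmt.Println("mapped to local user", name)
        } else {
            fmt.Println("no mapping; falling back to the anonymous id")
        }
    }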
Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.759 [INFO][4606] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" HandleID="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Workload="10.200.4.11-k8s-test--pod--1-eth0" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.773 [INFO][4606] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" HandleID="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Workload="10.200.4.11-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000292a50), Attrs:map[string]string{"namespace":"default", "node":"10.200.4.11", "pod":"test-pod-1", "timestamp":"2025-03-17 18:03:44.759371181 +0000 UTC"}, Hostname:"10.200.4.11", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.773 [INFO][4606] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.773 [INFO][4606] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.773 [INFO][4606] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '10.200.4.11' Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.776 [INFO][4606] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.780 [INFO][4606] ipam/ipam.go 372: Looking up existing affinities for host host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.785 [INFO][4606] ipam/ipam.go 489: Trying affinity for 192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.787 [INFO][4606] ipam/ipam.go 155: Attempting to load block cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.790 [INFO][4606] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.11.192/26 host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.790 [INFO][4606] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.11.192/26 handle="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.793 [INFO][4606] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.798 [INFO][4606] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.11.192/26 handle="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.812 [INFO][4606] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.11.196/26] block=192.168.11.192/26 handle="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 
18:03:44.812 [INFO][4606] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.11.196/26] handle="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" host="10.200.4.11" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.812 [INFO][4606] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.812 [INFO][4606] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.11.196/26] IPv6=[] ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" HandleID="k8s-pod-network.79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Workload="10.200.4.11-k8s-test--pod--1-eth0" Mar 17 18:03:44.829674 containerd[1738]: 2025-03-17 18:03:44.813 [INFO][4594] cni-plugin/k8s.go 386: Populated endpoint ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"af78577f-32ef-44a4-a83c-54fd6e7249dc", ResourceVersion:"1710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 3, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.11.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:44.830405 containerd[1738]: 2025-03-17 18:03:44.813 [INFO][4594] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.11.196/32] ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" Mar 17 18:03:44.830405 containerd[1738]: 2025-03-17 18:03:44.813 [INFO][4594] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" Mar 17 18:03:44.830405 containerd[1738]: 2025-03-17 18:03:44.817 [INFO][4594] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" Mar 17 18:03:44.830405 containerd[1738]: 2025-03-17 18:03:44.817 [INFO][4594] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"10.200.4.11-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"af78577f-32ef-44a4-a83c-54fd6e7249dc", ResourceVersion:"1710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 3, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"10.200.4.11", ContainerID:"79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.11.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"d2:9e:91:a1:20:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:03:44.830405 containerd[1738]: 2025-03-17 18:03:44.828 [INFO][4594] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="10.200.4.11-k8s-test--pod--1-eth0" Mar 17 18:03:44.868310 containerd[1738]: time="2025-03-17T18:03:44.866887415Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:03:44.868713 containerd[1738]: time="2025-03-17T18:03:44.868452337Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:03:44.868713 containerd[1738]: time="2025-03-17T18:03:44.868577239Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:44.868847 containerd[1738]: time="2025-03-17T18:03:44.868770042Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:03:44.896347 systemd[1]: Started cri-containerd-79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c.scope - libcontainer container 79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c. 
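The Calico IPAM exchange above takes the host-wide IPAM lock, finds this node's existing affinity for block 192.168.11.192/26, claims 192.168.11.196/26 from it, and writes the workload endpoint (interface cali5ec59c6bf6e, MAC d2:9e:91:a1:20:02) back to the datastore before systemd starts the sandbox's cri-containerd scope. A quick standard-library check of that result, independent of Calico:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Block and address taken from the IPAM entries above.
        _, block, err := net.ParseCIDR("192.168.11.192/26")
        if err != nil {
            panic(err)
        }
        ip := net.ParseIP("192.168.11.196")

        ones, bits := block.Mask.Size()
        fmt.Printf("%s contains %s: %v\n", block, ip, block.Contains(ip)) // true
        fmt.Printf("block size: %d addresses\n", 1<<(bits-ones))          // 64
    }

A /26 block holds 64 addresses, which is why the log shows Calico trying the node's affinity for a whole /26 chunk of the pod CIDR rather than negotiating individual IPs.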
Mar 17 18:03:44.934227 containerd[1738]: time="2025-03-17T18:03:44.934080374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:af78577f-32ef-44a4-a83c-54fd6e7249dc,Namespace:default,Attempt:0,} returns sandbox id \"79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c\"" Mar 17 18:03:44.935777 containerd[1738]: time="2025-03-17T18:03:44.935749698Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\"" Mar 17 18:03:45.357811 containerd[1738]: time="2025-03-17T18:03:45.357762719Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 18:03:45.360409 containerd[1738]: time="2025-03-17T18:03:45.360348056Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61" Mar 17 18:03:45.362780 containerd[1738]: time="2025-03-17T18:03:45.362747091Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:d25119ebd2aadc346788ac84ae0c5b1b018c687dcfd3167bb27e341f8b5caeee\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06\", size \"73060009\" in 426.889791ms" Mar 17 18:03:45.362780 containerd[1738]: time="2025-03-17T18:03:45.362779091Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:d25119ebd2aadc346788ac84ae0c5b1b018c687dcfd3167bb27e341f8b5caeee\"" Mar 17 18:03:45.364782 containerd[1738]: time="2025-03-17T18:03:45.364758819Z" level=info msg="CreateContainer within sandbox \"79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c\" for container &ContainerMetadata{Name:test,Attempt:0,}" Mar 17 18:03:45.396717 containerd[1738]: time="2025-03-17T18:03:45.396613974Z" level=info msg="CreateContainer within sandbox \"79c7aaf22304c801195cccfc8a5ab4204e9b3fd42d853de4cf4ad48ebc8d109c\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"32236fbcf376b7295580638084b2050f036331d6c1cec0b0d7f00433d39ce7f4\"" Mar 17 18:03:45.397624 containerd[1738]: time="2025-03-17T18:03:45.397481086Z" level=info msg="StartContainer for \"32236fbcf376b7295580638084b2050f036331d6c1cec0b0d7f00433d39ce7f4\"" Mar 17 18:03:45.427346 systemd[1]: Started cri-containerd-32236fbcf376b7295580638084b2050f036331d6c1cec0b0d7f00433d39ce7f4.scope - libcontainer container 32236fbcf376b7295580638084b2050f036331d6c1cec0b0d7f00433d39ce7f4. 
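The pull itself completes in roughly 427 ms with only 61 bytes read and an ImageUpdate (rather than a create) event, which suggests the nginx layers were already in the content store and only the reference was refreshed. The entry also names the image three ways: the image ID (sha256:d25119…, the digest of the image config), the repo tag (ghcr.io/flatcar/nginx:latest), and the repo digest (ghcr.io/flatcar/nginx@sha256:b927…). A small standard-library sketch separating those fields, with the values copied from the log:

    package main

    import (
        "fmt"
        "strings"
    )

    func main() {
        // Repo digest and size reported by the PullImage entries above.
        repoDigest := "ghcr.io/flatcar/nginx@sha256:b927c62cc716b99bce51774b46a63feb63f5414c6f985fb80cacd1933bbd0e06"
        const sizeBytes = 73060009

        name, digest, _ := strings.Cut(repoDigest, "@")
        fmt.Println("repository:", name)        // ghcr.io/flatcar/nginx
        fmt.Println("manifest digest:", digest) // sha256:b927c6...
        fmt.Printf("reported size: %.1f MiB\n", float64(sizeBytes)/(1<<20)) // ~69.7 MiB
    }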
Mar 17 18:03:45.452820 containerd[1738]: time="2025-03-17T18:03:45.452712474Z" level=info msg="StartContainer for \"32236fbcf376b7295580638084b2050f036331d6c1cec0b0d7f00433d39ce7f4\" returns successfully" Mar 17 18:03:45.714960 kubelet[2614]: E0317 18:03:45.714707 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:46.062419 kubelet[2614]: I0317 18:03:46.062359 2614 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="default/test-pod-1" podStartSLOduration=31.634003161 podStartE2EDuration="32.062345473s" podCreationTimestamp="2025-03-17 18:03:14 +0000 UTC" firstStartedPulling="2025-03-17 18:03:44.935098288 +0000 UTC m=+70.984331263" lastFinishedPulling="2025-03-17 18:03:45.3634407 +0000 UTC m=+71.412673575" observedRunningTime="2025-03-17 18:03:46.062061469 +0000 UTC m=+72.111294344" watchObservedRunningTime="2025-03-17 18:03:46.062345473 +0000 UTC m=+72.111578448" Mar 17 18:03:46.674590 systemd-networkd[1344]: cali5ec59c6bf6e: Gained IPv6LL Mar 17 18:03:46.715712 kubelet[2614]: E0317 18:03:46.715667 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:47.716184 kubelet[2614]: E0317 18:03:47.716086 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:48.716905 kubelet[2614]: E0317 18:03:48.716864 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:49.718159 kubelet[2614]: E0317 18:03:49.718058 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:50.718601 kubelet[2614]: E0317 18:03:50.718484 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:51.718991 kubelet[2614]: E0317 18:03:51.718901 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:52.719966 kubelet[2614]: E0317 18:03:52.719904 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Mar 17 18:03:53.720442 kubelet[2614]: E0317 18:03:53.720379 2614 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
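The pod_startup_latency_tracker entry above records two figures for test-pod-1: podStartE2EDuration is the wall time from podCreationTimestamp (18:03:14) to the watch-observed running time (18:03:46.062345473), i.e. 32.062345473s, while podStartSLOduration additionally subtracts the image-pull window measured on the kubelet's monotonic clock (the m=+… offsets of firstStartedPulling and lastFinishedPulling). Reproducing the arithmetic with the logged values:

    package main

    import "fmt"

    func main() {
        // Figures copied from the pod_startup_latency_tracker entry above.
        const (
            e2eSeconds      = 32.062345473 // watchObservedRunningTime - podCreationTimestamp
            startedPulling  = 70.984331263 // monotonic offset (m=+...) of firstStartedPulling
            finishedPulling = 71.412673575 // monotonic offset (m=+...) of lastFinishedPulling
        )
        pull := finishedPulling - startedPulling
        fmt.Printf("image pull window: %.9fs\n", pull)              // ~0.428s, close to containerd's 426.9ms
        fmt.Printf("podStartSLOduration: %.9fs\n", e2eSeconds-pull) // 31.634003161s, as reported
    }

The file_linux.go:61 errors that tick once per second through this whole span simply mean that the static-pod manifest directory /etc/kubernetes/manifests does not exist on this node; the kubelet keeps polling the configured path and ignores it until it appears.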