Feb 13 16:17:35.128264 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 13:54:58 -00 2025
Feb 13 16:17:35.128311 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=cd73eba291b8356dfc2c39f651cabef9206685f772c8949188fd366788d672c2
Feb 13 16:17:35.128331 kernel: BIOS-provided physical RAM map:
Feb 13 16:17:35.128340 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 13 16:17:35.128349 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 13 16:17:35.128363 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 13 16:17:35.128381 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
Feb 13 16:17:35.128396 kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
Feb 13 16:17:35.128407 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 13 16:17:35.128423 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 13 16:17:35.128433 kernel: NX (Execute Disable) protection: active
Feb 13 16:17:35.128445 kernel: APIC: Static calls initialized
Feb 13 16:17:35.128465 kernel: SMBIOS 2.8 present.
Feb 13 16:17:35.128473 kernel: DMI: DigitalOcean Droplet/Droplet, BIOS 20171212 12/12/2017
Feb 13 16:17:35.128481 kernel: Hypervisor detected: KVM
Feb 13 16:17:35.128491 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 13 16:17:35.128503 kernel: kvm-clock: using sched offset of 3962035336 cycles
Feb 13 16:17:35.128511 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 13 16:17:35.128518 kernel: tsc: Detected 1995.305 MHz processor
Feb 13 16:17:35.128525 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 16:17:35.128533 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 16:17:35.128540 kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
Feb 13 16:17:35.128547 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 13 16:17:35.128555 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 16:17:35.128565 kernel: ACPI: Early table checksum verification disabled
Feb 13 16:17:35.128572 kernel: ACPI: RSDP 0x00000000000F5950 000014 (v00 BOCHS )
Feb 13 16:17:35.128579 kernel: ACPI: RSDT 0x000000007FFE1986 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128586 kernel: ACPI: FACP 0x000000007FFE176A 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128595 kernel: ACPI: DSDT 0x000000007FFE0040 00172A (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128610 kernel: ACPI: FACS 0x000000007FFE0000 000040
Feb 13 16:17:35.128645 kernel: ACPI: APIC 0x000000007FFE17DE 000080 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128657 kernel: ACPI: HPET 0x000000007FFE185E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128669 kernel: ACPI: SRAT 0x000000007FFE1896 0000C8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128680 kernel: ACPI: WAET 0x000000007FFE195E 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Feb 13 16:17:35.128687 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe176a-0x7ffe17dd]
Feb 13 16:17:35.128694 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffe0040-0x7ffe1769]
Feb 13 16:17:35.128701 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffe0000-0x7ffe003f]
Feb 13 16:17:35.128708 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe17de-0x7ffe185d]
Feb 13 16:17:35.128715 kernel: ACPI: Reserving HPET table memory at [mem 0x7ffe185e-0x7ffe1895]
Feb 13 16:17:35.128722 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe1896-0x7ffe195d]
Feb 13 16:17:35.128735 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe195e-0x7ffe1985]
Feb 13 16:17:35.128743 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Feb 13 16:17:35.128750 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Feb 13 16:17:35.128758 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Feb 13 16:17:35.128765 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Feb 13 16:17:35.128777 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdafff] -> [mem 0x00000000-0x7ffdafff]
Feb 13 16:17:35.128785 kernel: NODE_DATA(0) allocated [mem 0x7ffd5000-0x7ffdafff]
Feb 13 16:17:35.128795 kernel: Zone ranges:
Feb 13 16:17:35.128803 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 16:17:35.128810 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdafff]
Feb 13 16:17:35.128817 kernel: Normal empty
Feb 13 16:17:35.128825 kernel: Movable zone start for each node
Feb 13 16:17:35.128832 kernel: Early memory node ranges
Feb 13 16:17:35.128840 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Feb 13 16:17:35.128847 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdafff]
Feb 13 16:17:35.128854 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdafff]
Feb 13 16:17:35.128864 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 16:17:35.128871 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 13 16:17:35.128883 kernel: On node 0, zone DMA32: 37 pages in unavailable ranges
Feb 13 16:17:35.128890 kernel: ACPI: PM-Timer IO Port: 0x608
Feb 13 16:17:35.128898 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 13 16:17:35.128905 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 13 16:17:35.128913 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 13 16:17:35.128920 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 13 16:17:35.128937 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 13 16:17:35.128947 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 13 16:17:35.128955 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 13 16:17:35.128962 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 16:17:35.128969 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Feb 13 16:17:35.128977 kernel: TSC deadline timer available
Feb 13 16:17:35.128992 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Feb 13 16:17:35.128999 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 13 16:17:35.129007 kernel: [mem 0x80000000-0xfeffbfff] available for PCI devices
Feb 13 16:17:35.129023 kernel: Booting paravirtualized kernel on KVM
Feb 13 16:17:35.129035 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 16:17:35.129051 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Feb 13 16:17:35.129064 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Feb 13 16:17:35.129077 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Feb 13 16:17:35.129090 kernel: pcpu-alloc: [0] 0 1
Feb 13 16:17:35.129103 kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 13 16:17:35.129118 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=cd73eba291b8356dfc2c39f651cabef9206685f772c8949188fd366788d672c2
Feb 13 16:17:35.129131 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 16:17:35.129143 kernel: random: crng init done
Feb 13 16:17:35.129155 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 16:17:35.129163 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 13 16:17:35.129172 kernel: Fallback order for Node 0: 0
Feb 13 16:17:35.129180 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515803
Feb 13 16:17:35.129189 kernel: Policy zone: DMA32
Feb 13 16:17:35.129197 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 16:17:35.129211 kernel: Memory: 1971204K/2096612K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 125148K reserved, 0K cma-reserved)
Feb 13 16:17:35.129245 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Feb 13 16:17:35.129261 kernel: Kernel/User page tables isolation: enabled
Feb 13 16:17:35.129276 kernel: ftrace: allocating 37920 entries in 149 pages
Feb 13 16:17:35.129289 kernel: ftrace: allocated 149 pages with 4 groups
Feb 13 16:17:35.129302 kernel: Dynamic Preempt: voluntary
Feb 13 16:17:35.129313 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 13 16:17:35.129327 kernel: rcu: RCU event tracing is enabled.
Feb 13 16:17:35.129339 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Feb 13 16:17:35.129364 kernel: Trampoline variant of Tasks RCU enabled.
Feb 13 16:17:35.129372 kernel: Rude variant of Tasks RCU enabled.
Feb 13 16:17:35.129380 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 16:17:35.129392 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 16:17:35.129399 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Feb 13 16:17:35.129407 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Feb 13 16:17:35.129414 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 13 16:17:35.129427 kernel: Console: colour VGA+ 80x25
Feb 13 16:17:35.129437 kernel: printk: console [tty0] enabled
Feb 13 16:17:35.129450 kernel: printk: console [ttyS0] enabled
Feb 13 16:17:35.129463 kernel: ACPI: Core revision 20230628
Feb 13 16:17:35.129477 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Feb 13 16:17:35.129493 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 16:17:35.129505 kernel: x2apic enabled
Feb 13 16:17:35.129517 kernel: APIC: Switched APIC routing to: physical x2apic
Feb 13 16:17:35.129529 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Feb 13 16:17:35.129543 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3985b6280e7, max_idle_ns: 881590416988 ns
Feb 13 16:17:35.129557 kernel: Calibrating delay loop (skipped) preset value.. 3990.61 BogoMIPS (lpj=1995305)
Feb 13 16:17:35.129572 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Feb 13 16:17:35.129585 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Feb 13 16:17:35.129613 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 16:17:35.129645 kernel: Spectre V2 : Mitigation: Retpolines
Feb 13 16:17:35.129660 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 16:17:35.129678 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 13 16:17:35.129691 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Feb 13 16:17:35.129702 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 16:17:35.129717 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 13 16:17:35.129731 kernel: MDS: Mitigation: Clear CPU buffers
Feb 13 16:17:35.129745 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Feb 13 16:17:35.129769 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 16:17:35.129785 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 16:17:35.129798 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 16:17:35.129812 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 16:17:35.129825 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 13 16:17:35.129839 kernel: Freeing SMP alternatives memory: 32K
Feb 13 16:17:35.129853 kernel: pid_max: default: 32768 minimum: 301
Feb 13 16:17:35.129868 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Feb 13 16:17:35.129886 kernel: landlock: Up and running.
Feb 13 16:17:35.129899 kernel: SELinux: Initializing.
Feb 13 16:17:35.129914 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 16:17:35.129931 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 16:17:35.129946 kernel: smpboot: CPU0: Intel DO-Regular (family: 0x6, model: 0x4f, stepping: 0x1)
Feb 13 16:17:35.129960 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Feb 13 16:17:35.129975 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Feb 13 16:17:35.129991 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Feb 13 16:17:35.130005 kernel: Performance Events: unsupported p6 CPU model 79 no PMU driver, software events only.
Feb 13 16:17:35.130023 kernel: signal: max sigframe size: 1776
Feb 13 16:17:35.130034 kernel: rcu: Hierarchical SRCU implementation.
Feb 13 16:17:35.130047 kernel: rcu: Max phase no-delay instances is 400.
Feb 13 16:17:35.130061 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Feb 13 16:17:35.130073 kernel: smp: Bringing up secondary CPUs ...
Feb 13 16:17:35.130086 kernel: smpboot: x86: Booting SMP configuration:
Feb 13 16:17:35.130103 kernel: .... node #0, CPUs: #1
Feb 13 16:17:35.130117 kernel: smp: Brought up 1 node, 2 CPUs
Feb 13 16:17:35.132314 kernel: smpboot: Max logical packages: 1
Feb 13 16:17:35.132339 kernel: smpboot: Total of 2 processors activated (7981.22 BogoMIPS)
Feb 13 16:17:35.132354 kernel: devtmpfs: initialized
Feb 13 16:17:35.132368 kernel: x86/mm: Memory block size: 128MB
Feb 13 16:17:35.132383 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 13 16:17:35.132398 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Feb 13 16:17:35.132412 kernel: pinctrl core: initialized pinctrl subsystem
Feb 13 16:17:35.132427 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 13 16:17:35.132440 kernel: audit: initializing netlink subsys (disabled)
Feb 13 16:17:35.132454 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 13 16:17:35.132475 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 13 16:17:35.132488 kernel: audit: type=2000 audit(1739463452.732:1): state=initialized audit_enabled=0 res=1
Feb 13 16:17:35.132501 kernel: cpuidle: using governor menu
Feb 13 16:17:35.132514 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 13 16:17:35.132527 kernel: dca service started, version 1.12.1
Feb 13 16:17:35.132539 kernel: PCI: Using configuration type 1 for base access
Feb 13 16:17:35.132551 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 13 16:17:35.132565 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 13 16:17:35.132581 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 13 16:17:35.132598 kernel: ACPI: Added _OSI(Module Device)
Feb 13 16:17:35.132612 kernel: ACPI: Added _OSI(Processor Device)
Feb 13 16:17:35.132624 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 13 16:17:35.132638 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 13 16:17:35.132654 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 13 16:17:35.132667 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Feb 13 16:17:35.132680 kernel: ACPI: Interpreter enabled
Feb 13 16:17:35.132694 kernel: ACPI: PM: (supports S0 S5)
Feb 13 16:17:35.132710 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 13 16:17:35.132730 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 13 16:17:35.132744 kernel: PCI: Using E820 reservations for host bridge windows
Feb 13 16:17:35.132758 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 13 16:17:35.132770 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 13 16:17:35.133063 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Feb 13 16:17:35.133192 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Feb 13 16:17:35.133317 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Feb 13 16:17:35.133335 kernel: acpiphp: Slot [3] registered
Feb 13 16:17:35.133344 kernel: acpiphp: Slot [4] registered
Feb 13 16:17:35.133352 kernel: acpiphp: Slot [5] registered
Feb 13 16:17:35.133361 kernel: acpiphp: Slot [6] registered
Feb 13 16:17:35.133369 kernel: acpiphp: Slot [7] registered
Feb 13 16:17:35.133377 kernel: acpiphp: Slot [8] registered
Feb 13 16:17:35.133385 kernel: acpiphp: Slot [9] registered
Feb 13 16:17:35.133393 kernel: acpiphp: Slot [10] registered
Feb 13 16:17:35.133402 kernel: acpiphp: Slot [11] registered
Feb 13 16:17:35.133414 kernel: acpiphp: Slot [12] registered
Feb 13 16:17:35.133422 kernel: acpiphp: Slot [13] registered
Feb 13 16:17:35.133430 kernel: acpiphp: Slot [14] registered
Feb 13 16:17:35.133438 kernel: acpiphp: Slot [15] registered
Feb 13 16:17:35.133446 kernel: acpiphp: Slot [16] registered
Feb 13 16:17:35.133454 kernel: acpiphp: Slot [17] registered
Feb 13 16:17:35.133462 kernel: acpiphp: Slot [18] registered
Feb 13 16:17:35.133471 kernel: acpiphp: Slot [19] registered
Feb 13 16:17:35.133479 kernel: acpiphp: Slot [20] registered
Feb 13 16:17:35.133487 kernel: acpiphp: Slot [21] registered
Feb 13 16:17:35.133498 kernel: acpiphp: Slot [22] registered
Feb 13 16:17:35.133506 kernel: acpiphp: Slot [23] registered
Feb 13 16:17:35.133514 kernel: acpiphp: Slot [24] registered
Feb 13 16:17:35.133527 kernel: acpiphp: Slot [25] registered
Feb 13 16:17:35.133542 kernel: acpiphp: Slot [26] registered
Feb 13 16:17:35.133555 kernel: acpiphp: Slot [27] registered
Feb 13 16:17:35.133564 kernel: acpiphp: Slot [28] registered
Feb 13 16:17:35.133572 kernel: acpiphp: Slot [29] registered
Feb 13 16:17:35.133580 kernel: acpiphp: Slot [30] registered
Feb 13 16:17:35.133589 kernel: acpiphp: Slot [31] registered
Feb 13 16:17:35.133601 kernel: PCI host bridge to bus 0000:00
Feb 13 16:17:35.133724 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 13 16:17:35.133811 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Feb 13 16:17:35.133895 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 13 16:17:35.133978 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Feb 13 16:17:35.134069 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window]
Feb 13 16:17:35.134169 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 13 16:17:35.134316 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 13 16:17:35.134432 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 13 16:17:35.134542 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 13 16:17:35.134638 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc1e0-0xc1ef]
Feb 13 16:17:35.134749 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Feb 13 16:17:35.134888 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Feb 13 16:17:35.135019 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Feb 13 16:17:35.135123 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Feb 13 16:17:35.137473 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 13 16:17:35.137704 kernel: pci 0000:00:01.2: reg 0x20: [io 0xc180-0xc19f]
Feb 13 16:17:35.137862 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 13 16:17:35.137961 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Feb 13 16:17:35.138079 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Feb 13 16:17:35.138194 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 13 16:17:35.138315 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 13 16:17:35.138428 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 13 16:17:35.138526 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfebf0000-0xfebf0fff]
Feb 13 16:17:35.138628 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfebe0000-0xfebeffff pref]
Feb 13 16:17:35.138771 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 13 16:17:35.138934 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 13 16:17:35.139035 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc1a0-0xc1bf]
Feb 13 16:17:35.139129 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebf1000-0xfebf1fff]
Feb 13 16:17:35.139221 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 13 16:17:35.141039 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
Feb 13 16:17:35.141144 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc1c0-0xc1df]
Feb 13 16:17:35.141316 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebf2000-0xfebf2fff]
Feb 13 16:17:35.141472 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 13 16:17:35.141641 kernel: pci 0000:00:05.0: [1af4:1004] type 00 class 0x010000
Feb 13 16:17:35.141845 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc100-0xc13f]
Feb 13 16:17:35.142011 kernel: pci 0000:00:05.0: reg 0x14: [mem 0xfebf3000-0xfebf3fff]
Feb 13 16:17:35.142187 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 13 16:17:35.142398 kernel: pci 0000:00:06.0: [1af4:1001] type 00 class 0x010000
Feb 13 16:17:35.142559 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc000-0xc07f]
Feb 13 16:17:35.142751 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfebf4000-0xfebf4fff]
Feb 13 16:17:35.142903 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 13 16:17:35.143063 kernel: pci 0000:00:07.0: [1af4:1001] type 00 class 0x010000
Feb 13 16:17:35.143215 kernel: pci 0000:00:07.0: reg 0x10: [io 0xc080-0xc0ff]
Feb 13 16:17:35.144044 kernel: pci 0000:00:07.0: reg 0x14: [mem 0xfebf5000-0xfebf5fff]
Feb 13 16:17:35.144208 kernel: pci 0000:00:07.0: reg 0x20: [mem 0xfe814000-0xfe817fff 64bit pref]
Feb 13 16:17:35.144436 kernel: pci 0000:00:08.0: [1af4:1002] type 00 class 0x00ff00
Feb 13 16:17:35.144607 kernel: pci 0000:00:08.0: reg 0x10: [io 0xc140-0xc17f]
Feb 13 16:17:35.144765 kernel: pci 0000:00:08.0: reg 0x20: [mem 0xfe818000-0xfe81bfff 64bit pref]
Feb 13 16:17:35.144786 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 13 16:17:35.144804 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 13 16:17:35.144816 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 13 16:17:35.144830 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 13 16:17:35.144844 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 13 16:17:35.144867 kernel: iommu: Default domain type: Translated
Feb 13 16:17:35.144880 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 13 16:17:35.144895 kernel: PCI: Using ACPI for IRQ routing
Feb 13 16:17:35.144910 kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 13 16:17:35.144925 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 13 16:17:35.144940 kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
Feb 13 16:17:35.145122 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 13 16:17:35.145298 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 13 16:17:35.145404 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 13 16:17:35.145427 kernel: vgaarb: loaded
Feb 13 16:17:35.145447 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Feb 13 16:17:35.145465 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Feb 13 16:17:35.145484 kernel: clocksource: Switched to clocksource kvm-clock
Feb 13 16:17:35.145499 kernel: VFS: Disk quotas dquot_6.6.0
Feb 13 16:17:35.145514 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 13 16:17:35.145527 kernel: pnp: PnP ACPI init
Feb 13 16:17:35.145540 kernel: pnp: PnP ACPI: found 4 devices
Feb 13 16:17:35.145554 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 13 16:17:35.145573 kernel: NET: Registered PF_INET protocol family
Feb 13 16:17:35.145589 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 13 16:17:35.145603 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Feb 13 16:17:35.145617 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 13 16:17:35.145630 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 13 16:17:35.145644 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 16:17:35.145657 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Feb 13 16:17:35.145670 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 16:17:35.145684 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 16:17:35.145701 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 13 16:17:35.145714 kernel: NET: Registered PF_XDP protocol family
Feb 13 16:17:35.145880 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Feb 13 16:17:35.146061 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Feb 13 16:17:35.146784 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 13 16:17:35.146952 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Feb 13 16:17:35.147101 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x17fffffff window]
Feb 13 16:17:35.147467 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 13 16:17:35.147668 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 13 16:17:35.147690 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 13 16:17:35.147848 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x7a0 took 45150 usecs
Feb 13 16:17:35.147869 kernel: PCI: CLS 0 bytes, default 64
Feb 13 16:17:35.147884 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Feb 13 16:17:35.147897 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x3985b6280e7, max_idle_ns: 881590416988 ns
Feb 13 16:17:35.147911 kernel: Initialise system trusted keyrings
Feb 13 16:17:35.147925 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Feb 13 16:17:35.147947 kernel: Key type asymmetric registered
Feb 13 16:17:35.147962 kernel: Asymmetric key parser 'x509' registered
Feb 13 16:17:35.147979 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Feb 13 16:17:35.147995 kernel: io scheduler mq-deadline registered
Feb 13 16:17:35.148010 kernel: io scheduler kyber registered
Feb 13 16:17:35.148026 kernel: io scheduler bfq registered
Feb 13 16:17:35.148042 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Feb 13 16:17:35.148058 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 13 16:17:35.148074 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 13 16:17:35.148088 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 13 16:17:35.148106 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 13 16:17:35.148121 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 13 16:17:35.148136 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 13 16:17:35.148173 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 13 16:17:35.148189 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 13 16:17:35.153440 kernel: rtc_cmos 00:03: RTC can wake from S4
Feb 13 16:17:35.153511 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Feb 13 16:17:35.153693 kernel: rtc_cmos 00:03: registered as rtc0
Feb 13 16:17:35.154214 kernel: rtc_cmos 00:03: setting system clock to 2025-02-13T16:17:34 UTC (1739463454)
Feb 13 16:17:35.154469 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Feb 13 16:17:35.154494 kernel: intel_pstate: CPU model not supported
Feb 13 16:17:35.154510 kernel: NET: Registered PF_INET6 protocol family
Feb 13 16:17:35.154523 kernel: Segment Routing with IPv6
Feb 13 16:17:35.154535 kernel: In-situ OAM (IOAM) with IPv6
Feb 13 16:17:35.154549 kernel: NET: Registered PF_PACKET protocol family
Feb 13 16:17:35.154563 kernel: Key type dns_resolver registered
Feb 13 16:17:35.154586 kernel: IPI shorthand broadcast: enabled
Feb 13 16:17:35.154600 kernel: sched_clock: Marking stable (1544048523, 173804308)->(1856722972, -138870141)
Feb 13 16:17:35.154613 kernel: registered taskstats version 1
Feb 13 16:17:35.154627 kernel: Loading compiled-in X.509 certificates
Feb 13 16:17:35.154641 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 9ec780e1db69d46be90bbba73ae62b0106e27ae0'
Feb 13 16:17:35.154656 kernel: Key type .fscrypt registered
Feb 13 16:17:35.154673 kernel: Key type fscrypt-provisioning registered
Feb 13 16:17:35.154687 kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 13 16:17:35.154700 kernel: ima: Allocated hash algorithm: sha1
Feb 13 16:17:35.154721 kernel: ima: No architecture policies found
Feb 13 16:17:35.154821 kernel: clk: Disabling unused clocks
Feb 13 16:17:35.154836 kernel: Freeing unused kernel image (initmem) memory: 42976K
Feb 13 16:17:35.157260 kernel: Write protecting the kernel read-only data: 36864k
Feb 13 16:17:35.157350 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K
Feb 13 16:17:35.157370 kernel: Run /init as init process
Feb 13 16:17:35.157385 kernel: with arguments:
Feb 13 16:17:35.157401 kernel: /init
Feb 13 16:17:35.157416 kernel: with environment:
Feb 13 16:17:35.157458 kernel: HOME=/
Feb 13 16:17:35.157474 kernel: TERM=linux
Feb 13 16:17:35.157488 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Feb 13 16:17:35.157514 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Feb 13 16:17:35.157532 systemd[1]: Detected virtualization kvm.
Feb 13 16:17:35.157549 systemd[1]: Detected architecture x86-64.
Feb 13 16:17:35.157564 systemd[1]: Running in initrd.
Feb 13 16:17:35.157582 systemd[1]: No hostname configured, using default hostname.
Feb 13 16:17:35.157602 systemd[1]: Hostname set to .
Feb 13 16:17:35.157617 systemd[1]: Initializing machine ID from VM UUID.
Feb 13 16:17:35.157632 systemd[1]: Queued start job for default target initrd.target.
Feb 13 16:17:35.157647 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 16:17:35.157662 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 16:17:35.157680 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Feb 13 16:17:35.157695 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 16:17:35.157709 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Feb 13 16:17:35.157729 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Feb 13 16:17:35.157748 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Feb 13 16:17:35.157765 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Feb 13 16:17:35.157779 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 16:17:35.157794 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 16:17:35.157808 systemd[1]: Reached target paths.target - Path Units.
Feb 13 16:17:35.157829 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 16:17:35.157846 systemd[1]: Reached target swap.target - Swaps.
Feb 13 16:17:35.157867 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 16:17:35.157885 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 16:17:35.157902 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 16:17:35.157918 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Feb 13 16:17:35.157937 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Feb 13 16:17:35.157956 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 16:17:35.157971 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 16:17:35.157991 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 16:17:35.158009 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 16:17:35.158027 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Feb 13 16:17:35.158043 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 16:17:35.158059 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Feb 13 16:17:35.158078 systemd[1]: Starting systemd-fsck-usr.service...
Feb 13 16:17:35.158095 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 16:17:35.158112 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 16:17:35.158198 systemd-journald[184]: Collecting audit messages is disabled.
Feb 13 16:17:35.158265 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 16:17:35.158280 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Feb 13 16:17:35.158298 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 16:17:35.158314 systemd-journald[184]: Journal started
Feb 13 16:17:35.158348 systemd-journald[184]: Runtime Journal (/run/log/journal/f6a9ff72ed1f4412ab1bbeae1df04298) is 4.9M, max 39.3M, 34.4M free.
Feb 13 16:17:35.159839 systemd-modules-load[185]: Inserted module 'overlay'
Feb 13 16:17:35.166355 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 16:17:35.168520 systemd[1]: Finished systemd-fsck-usr.service.
Feb 13 16:17:35.186572 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Feb 13 16:17:35.238629 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 13 16:17:35.238674 kernel: Bridge firewalling registered
Feb 13 16:17:35.212984 systemd-modules-load[185]: Inserted module 'br_netfilter'
Feb 13 16:17:35.245664 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 16:17:35.253396 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 16:17:35.257716 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:35.259407 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 16:17:35.266581 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 16:17:35.273778 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 16:17:35.286493 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 16:17:35.291437 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 16:17:35.297408 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 16:17:35.314541 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 16:17:35.321308 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 16:17:35.324582 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 16:17:35.333365 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Feb 13 16:17:35.367834 dracut-cmdline[218]: dracut-dracut-053
Feb 13 16:17:35.378730 systemd-resolved[215]: Positive Trust Anchors:
Feb 13 16:17:35.380762 dracut-cmdline[218]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=cd73eba291b8356dfc2c39f651cabef9206685f772c8949188fd366788d672c2
Feb 13 16:17:35.378753 systemd-resolved[215]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 16:17:35.378812 systemd-resolved[215]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 16:17:35.384260 systemd-resolved[215]: Defaulting to hostname 'linux'.
Feb 13 16:17:35.388539 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 16:17:35.389795 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 16:17:35.539368 kernel: SCSI subsystem initialized
Feb 13 16:17:35.554286 kernel: Loading iSCSI transport class v2.0-870.
Feb 13 16:17:35.577197 kernel: iscsi: registered transport (tcp)
Feb 13 16:17:35.622462 kernel: iscsi: registered transport (qla4xxx)
Feb 13 16:17:35.622564 kernel: QLogic iSCSI HBA Driver
Feb 13 16:17:35.717399 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Feb 13 16:17:35.724552 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Feb 13 16:17:35.773820 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 13 16:17:35.773928 kernel: device-mapper: uevent: version 1.0.3
Feb 13 16:17:35.773970 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Feb 13 16:17:35.841332 kernel: raid6: avx2x4 gen() 18029 MB/s
Feb 13 16:17:35.859779 kernel: raid6: avx2x2 gen() 18615 MB/s
Feb 13 16:17:35.878412 kernel: raid6: avx2x1 gen() 13524 MB/s
Feb 13 16:17:35.878513 kernel: raid6: using algorithm avx2x2 gen() 18615 MB/s
Feb 13 16:17:35.897608 kernel: raid6: .... xor() 10874 MB/s, rmw enabled
Feb 13 16:17:35.897709 kernel: raid6: using avx2x2 recovery algorithm
Feb 13 16:17:35.938287 kernel: xor: automatically using best checksumming function avx
Feb 13 16:17:36.163319 kernel: Btrfs loaded, zoned=no, fsverity=no
Feb 13 16:17:36.193201 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 16:17:36.205578 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 16:17:36.234005 systemd-udevd[401]: Using default interface naming scheme 'v255'.
Feb 13 16:17:36.241521 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 16:17:36.249899 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Feb 13 16:17:36.289480 dracut-pre-trigger[411]: rd.md=0: removing MD RAID activation
Feb 13 16:17:36.360422 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 16:17:36.369332 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 16:17:36.457138 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 16:17:36.464046 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Feb 13 16:17:36.501985 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Feb 13 16:17:36.505080 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 16:17:36.505932 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 16:17:36.508973 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 16:17:36.517698 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Feb 13 16:17:36.556647 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 16:17:36.615919 kernel: scsi host0: Virtio SCSI HBA
Feb 13 16:17:36.624278 kernel: virtio_blk virtio4: 1/0/0 default/read/poll queues
Feb 13 16:17:36.753991 kernel: cryptd: max_cpu_qlen set to 1000
Feb 13 16:17:36.754028 kernel: virtio_blk virtio4: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Feb 13 16:17:36.754270 kernel: libata version 3.00 loaded.
Feb 13 16:17:36.754291 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 13 16:17:36.754320 kernel: GPT:9289727 != 125829119
Feb 13 16:17:36.754338 kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 13 16:17:36.754359 kernel: GPT:9289727 != 125829119
Feb 13 16:17:36.754377 kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 13 16:17:36.754394 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 16:17:36.754410 kernel: ata_piix 0000:00:01.1: version 2.13
Feb 13 16:17:36.754620 kernel: scsi host1: ata_piix
Feb 13 16:17:36.754815 kernel: scsi host2: ata_piix
Feb 13 16:17:36.755039 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc1e0 irq 14
Feb 13 16:17:36.755060 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc1e8 irq 15
Feb 13 16:17:36.749871 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 16:17:36.768371 kernel: virtio_blk virtio5: 1/0/0 default/read/poll queues
Feb 13 16:17:36.768582 kernel: virtio_blk virtio5: [vdb] 932 512-byte logical blocks (477 kB/466 KiB)
Feb 13 16:17:36.749993 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 16:17:36.752014 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 16:17:36.752804 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 16:17:36.753340 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:36.754815 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 16:17:36.787201 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 16:17:36.795593 kernel: ACPI: bus type USB registered
Feb 13 16:17:36.795630 kernel: usbcore: registered new interface driver usbfs
Feb 13 16:17:36.795650 kernel: usbcore: registered new interface driver hub
Feb 13 16:17:36.795666 kernel: usbcore: registered new device driver usb
Feb 13 16:17:36.866838 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:36.876281 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 16:17:36.931324 kernel: AVX2 version of gcm_enc/dec engaged.
Feb 13 16:17:36.933279 kernel: AES CTR mode by8 optimization enabled
Feb 13 16:17:36.956867 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 16:17:37.004269 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (448)
Feb 13 16:17:37.014274 kernel: BTRFS: device fsid 966d6124-9067-4089-b000-5e99065fe7e2 devid 1 transid 37 /dev/vda3 scanned by (udev-worker) (465)
Feb 13 16:17:37.022388 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Feb 13 16:17:37.032042 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Feb 13 16:17:37.057477 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 13 16:17:37.063941 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 13 16:17:37.064457 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 13 16:17:37.064637 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c180
Feb 13 16:17:37.064779 kernel: hub 1-0:1.0: USB hub found
Feb 13 16:17:37.064943 kernel: hub 1-0:1.0: 2 ports detected
Feb 13 16:17:37.059165 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Feb 13 16:17:37.066018 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Feb 13 16:17:37.074884 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Feb 13 16:17:37.083104 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Feb 13 16:17:37.101902 disk-uuid[551]: Primary Header is updated.
Feb 13 16:17:37.101902 disk-uuid[551]: Secondary Entries is updated.
Feb 13 16:17:37.101902 disk-uuid[551]: Secondary Header is updated.
Feb 13 16:17:37.108253 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 16:17:37.115325 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 16:17:38.129483 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Feb 13 16:17:38.131894 disk-uuid[552]: The operation has completed successfully.
Feb 13 16:17:38.218457 systemd[1]: disk-uuid.service: Deactivated successfully.
Feb 13 16:17:38.218625 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Feb 13 16:17:38.250701 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Feb 13 16:17:38.259927 sh[563]: Success
Feb 13 16:17:38.284493 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Feb 13 16:17:38.365582 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Feb 13 16:17:38.380685 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Feb 13 16:17:38.382421 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Feb 13 16:17:38.424703 kernel: BTRFS info (device dm-0): first mount of filesystem 966d6124-9067-4089-b000-5e99065fe7e2
Feb 13 16:17:38.424789 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Feb 13 16:17:38.426501 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Feb 13 16:17:38.430193 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Feb 13 16:17:38.431476 kernel: BTRFS info (device dm-0): using free space tree
Feb 13 16:17:38.454994 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Feb 13 16:17:38.456740 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Feb 13 16:17:38.470818 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Feb 13 16:17:38.473435 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Feb 13 16:17:38.498373 kernel: BTRFS info (device vda6): first mount of filesystem 83f602a1-06be-4b8b-b461-5e4f70db8da1
Feb 13 16:17:38.498489 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 16:17:38.498510 kernel: BTRFS info (device vda6): using free space tree
Feb 13 16:17:38.511517 kernel: BTRFS info (device vda6): auto enabling async discard
Feb 13 16:17:38.533139 systemd[1]: mnt-oem.mount: Deactivated successfully.
Feb 13 16:17:38.536915 kernel: BTRFS info (device vda6): last unmount of filesystem 83f602a1-06be-4b8b-b461-5e4f70db8da1
Feb 13 16:17:38.550196 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Feb 13 16:17:38.568327 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Feb 13 16:17:38.703539 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 16:17:38.719662 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 16:17:38.758300 systemd-networkd[749]: lo: Link UP
Feb 13 16:17:38.758315 systemd-networkd[749]: lo: Gained carrier
Feb 13 16:17:38.762221 systemd-networkd[749]: Enumeration completed
Feb 13 16:17:38.762462 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 16:17:38.762840 systemd-networkd[749]: eth0: found matching network '/usr/lib/systemd/network/yy-digitalocean.network', based on potentially unpredictable interface name.
Feb 13 16:17:38.762846 systemd-networkd[749]: eth0: Configuring with /usr/lib/systemd/network/yy-digitalocean.network.
Feb 13 16:17:38.763364 systemd[1]: Reached target network.target - Network.
Feb 13 16:17:38.765836 systemd-networkd[749]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Feb 13 16:17:38.765842 systemd-networkd[749]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Feb 13 16:17:38.770777 systemd-networkd[749]: eth0: Link UP
Feb 13 16:17:38.770784 systemd-networkd[749]: eth0: Gained carrier
Feb 13 16:17:38.770801 systemd-networkd[749]: eth0: found matching network '/usr/lib/systemd/network/yy-digitalocean.network', based on potentially unpredictable interface name.
Feb 13 16:17:38.780952 systemd-networkd[749]: eth1: Link UP
Feb 13 16:17:38.780958 systemd-networkd[749]: eth1: Gained carrier
Feb 13 16:17:38.780978 systemd-networkd[749]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Feb 13 16:17:38.804388 systemd-networkd[749]: eth0: DHCPv4 address 64.227.101.255/20, gateway 64.227.96.1 acquired from 169.254.169.253
Feb 13 16:17:38.813349 systemd-networkd[749]: eth1: DHCPv4 address 10.124.0.4/20 acquired from 169.254.169.253
Feb 13 16:17:38.814423 ignition[659]: Ignition 2.20.0
Feb 13 16:17:38.814431 ignition[659]: Stage: fetch-offline
Feb 13 16:17:38.814473 ignition[659]: no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:38.819717 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 16:17:38.814483 ignition[659]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:38.814598 ignition[659]: parsed url from cmdline: ""
Feb 13 16:17:38.814602 ignition[659]: no config URL provided
Feb 13 16:17:38.814607 ignition[659]: reading system config file "/usr/lib/ignition/user.ign"
Feb 13 16:17:38.814617 ignition[659]: no config at "/usr/lib/ignition/user.ign"
Feb 13 16:17:38.814624 ignition[659]: failed to fetch config: resource requires networking
Feb 13 16:17:38.814908 ignition[659]: Ignition finished successfully
Feb 13 16:17:38.830689 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Feb 13 16:17:38.874604 ignition[759]: Ignition 2.20.0
Feb 13 16:17:38.874623 ignition[759]: Stage: fetch
Feb 13 16:17:38.874911 ignition[759]: no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:38.874927 ignition[759]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:38.875087 ignition[759]: parsed url from cmdline: ""
Feb 13 16:17:38.875093 ignition[759]: no config URL provided
Feb 13 16:17:38.875102 ignition[759]: reading system config file "/usr/lib/ignition/user.ign"
Feb 13 16:17:38.875117 ignition[759]: no config at "/usr/lib/ignition/user.ign"
Feb 13 16:17:38.875152 ignition[759]: GET http://169.254.169.254/metadata/v1/user-data: attempt #1
Feb 13 16:17:38.921464 ignition[759]: GET result: OK
Feb 13 16:17:38.929978 unknown[759]: fetched base config from "system"
Feb 13 16:17:38.921580 ignition[759]: parsing config with SHA512: 9bee39f21c7e89931ee27a64e54869d0dcb2b1896ce958524f5137ff38bafa2cebba585bc5993a7756563adb42e32470a98c8f5b7e5b55770d724ac247baec1b
Feb 13 16:17:38.929992 unknown[759]: fetched base config from "system"
Feb 13 16:17:38.930450 ignition[759]: fetch: fetch complete
Feb 13 16:17:38.930004 unknown[759]: fetched user config from "digitalocean"
Feb 13 16:17:38.930459 ignition[759]: fetch: fetch passed
Feb 13 16:17:38.930537 ignition[759]: Ignition finished successfully
Feb 13 16:17:38.938427 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Feb 13 16:17:38.955038 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Feb 13 16:17:38.979204 ignition[765]: Ignition 2.20.0
Feb 13 16:17:38.979222 ignition[765]: Stage: kargs
Feb 13 16:17:38.979669 ignition[765]: no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:38.979687 ignition[765]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:38.981825 ignition[765]: kargs: kargs passed
Feb 13 16:17:38.983794 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Feb 13 16:17:38.981904 ignition[765]: Ignition finished successfully
Feb 13 16:17:38.999651 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Feb 13 16:17:39.030140 ignition[771]: Ignition 2.20.0
Feb 13 16:17:39.030171 ignition[771]: Stage: disks
Feb 13 16:17:39.030615 ignition[771]: no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:39.030634 ignition[771]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:39.033997 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Feb 13 16:17:39.031808 ignition[771]: disks: disks passed
Feb 13 16:17:39.031884 ignition[771]: Ignition finished successfully
Feb 13 16:17:39.045389 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Feb 13 16:17:39.047363 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Feb 13 16:17:39.052409 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 16:17:39.053426 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 16:17:39.056720 systemd[1]: Reached target basic.target - Basic System.
Feb 13 16:17:39.062591 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Feb 13 16:17:39.112282 systemd-fsck[780]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Feb 13 16:17:39.117907 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Feb 13 16:17:39.133601 systemd[1]: Mounting sysroot.mount - /sysroot...
Feb 13 16:17:39.345263 kernel: EXT4-fs (vda9): mounted filesystem 85ed0b0d-7f0f-4eeb-80d8-6213e9fcc55d r/w with ordered data mode. Quota mode: none.
Feb 13 16:17:39.346790 systemd[1]: Mounted sysroot.mount - /sysroot.
Feb 13 16:17:39.348644 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Feb 13 16:17:39.358066 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 16:17:39.370550 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Feb 13 16:17:39.377593 systemd[1]: Starting flatcar-afterburn-network.service - Flatcar Afterburn network service...
Feb 13 16:17:39.393705 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (788)
Feb 13 16:17:39.393785 kernel: BTRFS info (device vda6): first mount of filesystem 83f602a1-06be-4b8b-b461-5e4f70db8da1
Feb 13 16:17:39.398713 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 16:17:39.398805 kernel: BTRFS info (device vda6): using free space tree
Feb 13 16:17:39.399722 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Feb 13 16:17:39.402321 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Feb 13 16:17:39.402453 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 16:17:39.410133 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Feb 13 16:17:39.421989 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Feb 13 16:17:39.428297 kernel: BTRFS info (device vda6): auto enabling async discard
Feb 13 16:17:39.440657 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 16:17:39.561206 initrd-setup-root[818]: cut: /sysroot/etc/passwd: No such file or directory
Feb 13 16:17:39.580396 coreos-metadata[791]: Feb 13 16:17:39.578 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 16:17:39.584588 initrd-setup-root[825]: cut: /sysroot/etc/group: No such file or directory
Feb 13 16:17:39.588466 coreos-metadata[790]: Feb 13 16:17:39.584 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 16:17:39.598554 coreos-metadata[791]: Feb 13 16:17:39.597 INFO Fetch successful
Feb 13 16:17:39.599742 initrd-setup-root[832]: cut: /sysroot/etc/shadow: No such file or directory
Feb 13 16:17:39.606803 coreos-metadata[791]: Feb 13 16:17:39.606 INFO wrote hostname ci-4152.2.1-0-f194220f8f to /sysroot/etc/hostname
Feb 13 16:17:39.609642 coreos-metadata[790]: Feb 13 16:17:39.607 INFO Fetch successful
Feb 13 16:17:39.610456 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Feb 13 16:17:39.614568 systemd[1]: flatcar-afterburn-network.service: Deactivated successfully.
Feb 13 16:17:39.614772 systemd[1]: Finished flatcar-afterburn-network.service - Flatcar Afterburn network service.
Feb 13 16:17:39.623202 initrd-setup-root[841]: cut: /sysroot/etc/gshadow: No such file or directory
Feb 13 16:17:39.832442 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Feb 13 16:17:39.840471 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Feb 13 16:17:39.844891 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Feb 13 16:17:39.866266 kernel: BTRFS info (device vda6): last unmount of filesystem 83f602a1-06be-4b8b-b461-5e4f70db8da1
Feb 13 16:17:39.867339 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Feb 13 16:17:39.913900 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Feb 13 16:17:39.919409 ignition[908]: INFO : Ignition 2.20.0
Feb 13 16:17:39.919409 ignition[908]: INFO : Stage: mount
Feb 13 16:17:39.919409 ignition[908]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:39.919409 ignition[908]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:39.929003 ignition[908]: INFO : mount: mount passed
Feb 13 16:17:39.929003 ignition[908]: INFO : Ignition finished successfully
Feb 13 16:17:39.922049 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Feb 13 16:17:39.936458 systemd[1]: Starting ignition-files.service - Ignition (files)...
Feb 13 16:17:39.966132 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 16:17:39.988290 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (921)
Feb 13 16:17:39.995359 kernel: BTRFS info (device vda6): first mount of filesystem 83f602a1-06be-4b8b-b461-5e4f70db8da1
Feb 13 16:17:39.996916 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 16:17:39.997002 kernel: BTRFS info (device vda6): using free space tree
Feb 13 16:17:40.007399 kernel: BTRFS info (device vda6): auto enabling async discard
Feb 13 16:17:40.011993 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 16:17:40.058279 ignition[938]: INFO : Ignition 2.20.0
Feb 13 16:17:40.058279 ignition[938]: INFO : Stage: files
Feb 13 16:17:40.058279 ignition[938]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:40.058279 ignition[938]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:40.065756 ignition[938]: DEBUG : files: compiled without relabeling support, skipping
Feb 13 16:17:40.069448 ignition[938]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Feb 13 16:17:40.069448 ignition[938]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Feb 13 16:17:40.073419 ignition[938]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Feb 13 16:17:40.074786 ignition[938]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Feb 13 16:17:40.074786 ignition[938]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Feb 13 16:17:40.073980 unknown[938]: wrote ssh authorized keys file for user: core
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/home/core/install.sh"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/home/core/install.sh"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Feb 13 16:17:40.078363 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Feb 13 16:17:40.303034 systemd-networkd[749]: eth1: Gained IPv6LL
Feb 13 16:17:40.367108 systemd-networkd[749]: eth0: Gained IPv6LL
Feb 13 16:17:40.576072 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(6): GET result: OK
Feb 13 16:17:40.987001 ignition[938]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Feb 13 16:17:40.988535 ignition[938]: INFO : files: createResultFile: createFiles: op(7): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 16:17:40.988535 ignition[938]: INFO : files: createResultFile: createFiles: op(7): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 16:17:40.988535 ignition[938]: INFO : files: files passed
Feb 13 16:17:40.988535 ignition[938]: INFO : Ignition finished successfully
Feb 13 16:17:40.989985 systemd[1]: Finished ignition-files.service - Ignition (files).
Feb 13 16:17:40.999039 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Feb 13 16:17:41.009477 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Feb 13 16:17:41.015000 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 13 16:17:41.015168 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Feb 13 16:17:41.038500 initrd-setup-root-after-ignition[967]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 16:17:41.038500 initrd-setup-root-after-ignition[967]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 16:17:41.040696 initrd-setup-root-after-ignition[971]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 16:17:41.043759 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 16:17:41.044885 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Feb 13 16:17:41.055096 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Feb 13 16:17:41.097819 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 13 16:17:41.098049 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Feb 13 16:17:41.100023 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Feb 13 16:17:41.101269 systemd[1]: Reached target initrd.target - Initrd Default Target.
Feb 13 16:17:41.104061 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Feb 13 16:17:41.109603 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Feb 13 16:17:41.140148 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 16:17:41.145531 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Feb 13 16:17:41.165106 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Feb 13 16:17:41.167437 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 16:17:41.169594 systemd[1]: Stopped target timers.target - Timer Units.
Feb 13 16:17:41.171404 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 13 16:17:41.171632 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 16:17:41.173406 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Feb 13 16:17:41.174514 systemd[1]: Stopped target basic.target - Basic System.
Feb 13 16:17:41.176188 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Feb 13 16:17:41.177920 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 16:17:41.179274 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Feb 13 16:17:41.180673 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Feb 13 16:17:41.182512 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 16:17:41.184108 systemd[1]: Stopped target sysinit.target - System Initialization.
Feb 13 16:17:41.189636 systemd[1]: Stopped target local-fs.target - Local File Systems.
Feb 13 16:17:41.190433 systemd[1]: Stopped target swap.target - Swaps.
Feb 13 16:17:41.192724 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 13 16:17:41.192949 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 16:17:41.203685 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Feb 13 16:17:41.204887 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 16:17:41.206127 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Feb 13 16:17:41.206296 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 16:17:41.209121 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 13 16:17:41.209433 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Feb 13 16:17:41.211564 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Feb 13 16:17:41.211867 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 16:17:41.213920 systemd[1]: ignition-files.service: Deactivated successfully.
Feb 13 16:17:41.214105 systemd[1]: Stopped ignition-files.service - Ignition (files).
Feb 13 16:17:41.215407 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Feb 13 16:17:41.215656 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Feb 13 16:17:41.224724 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Feb 13 16:17:41.228381 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Feb 13 16:17:41.230793 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 13 16:17:41.231053 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 16:17:41.236519 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 13 16:17:41.236772 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 16:17:41.246647 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 13 16:17:41.247344 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Feb 13 16:17:41.260299 ignition[991]: INFO : Ignition 2.20.0
Feb 13 16:17:41.260299 ignition[991]: INFO : Stage: umount
Feb 13 16:17:41.260299 ignition[991]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 16:17:41.260299 ignition[991]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
Feb 13 16:17:41.273522 ignition[991]: INFO : umount: umount passed
Feb 13 16:17:41.273522 ignition[991]: INFO : Ignition finished successfully
Feb 13 16:17:41.262837 systemd[1]: ignition-mount.service: Deactivated successfully.
Feb 13 16:17:41.263064 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Feb 13 16:17:41.265869 systemd[1]: ignition-disks.service: Deactivated successfully.
Feb 13 16:17:41.266018 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Feb 13 16:17:41.270856 systemd[1]: ignition-kargs.service: Deactivated successfully.
Feb 13 16:17:41.270955 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Feb 13 16:17:41.271635 systemd[1]: ignition-fetch.service: Deactivated successfully.
Feb 13 16:17:41.271711 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Feb 13 16:17:41.274629 systemd[1]: Stopped target network.target - Network.
Feb 13 16:17:41.275577 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Feb 13 16:17:41.275905 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 16:17:41.279510 systemd[1]: Stopped target paths.target - Path Units.
Feb 13 16:17:41.282494 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 13 16:17:41.283655 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 16:17:41.285382 systemd[1]: Stopped target slices.target - Slice Units.
Feb 13 16:17:41.286012 systemd[1]: Stopped target sockets.target - Socket Units.
Feb 13 16:17:41.286850 systemd[1]: iscsid.socket: Deactivated successfully.
Feb 13 16:17:41.286945 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 16:17:41.290156 systemd[1]: iscsiuio.socket: Deactivated successfully.
Feb 13 16:17:41.290272 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 16:17:41.292602 systemd[1]: ignition-setup.service: Deactivated successfully.
Feb 13 16:17:41.292734 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Feb 13 16:17:41.293662 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Feb 13 16:17:41.293765 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Feb 13 16:17:41.298190 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Feb 13 16:17:41.300774 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Feb 13 16:17:41.304327 systemd-networkd[749]: eth1: DHCPv6 lease lost
Feb 13 16:17:41.307377 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Feb 13 16:17:41.309025 systemd[1]: sysroot-boot.service: Deactivated successfully.
Feb 13 16:17:41.309201 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Feb 13 16:17:41.312405 systemd-networkd[749]: eth0: DHCPv6 lease lost
Feb 13 16:17:41.315658 systemd[1]: systemd-resolved.service: Deactivated successfully.
Feb 13 16:17:41.315895 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Feb 13 16:17:41.324604 systemd[1]: systemd-networkd.service: Deactivated successfully.
Feb 13 16:17:41.324917 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Feb 13 16:17:41.334936 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Feb 13 16:17:41.335028 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 16:17:41.336744 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Feb 13 16:17:41.336844 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Feb 13 16:17:41.347505 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Feb 13 16:17:41.348387 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Feb 13 16:17:41.348504 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 16:17:41.352772 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 13 16:17:41.352876 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Feb 13 16:17:41.354721 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 13 16:17:41.354814 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Feb 13 16:17:41.356898 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 13 16:17:41.357030 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 16:17:41.361448 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 16:17:41.379911 systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 13 16:17:41.383879 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 16:17:41.385652 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 13 16:17:41.385717 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Feb 13 16:17:41.391838 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 13 16:17:41.391934 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 16:17:41.393597 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 13 16:17:41.393693 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 16:17:41.398012 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 13 16:17:41.398192 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Feb 13 16:17:41.401012 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 16:17:41.401117 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 16:17:41.417489 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Feb 13 16:17:41.418208 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 13 16:17:41.418335 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 16:17:41.424839 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Feb 13 16:17:41.424954 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 16:17:41.431382 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 13 16:17:41.431507 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 16:17:41.432504 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 16:17:41.432624 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:41.434358 systemd[1]: network-cleanup.service: Deactivated successfully.
Feb 13 16:17:41.434516 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Feb 13 16:17:41.442146 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 13 16:17:41.442646 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Feb 13 16:17:41.444309 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Feb 13 16:17:41.452133 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Feb 13 16:17:41.505443 systemd[1]: Switching root.
Feb 13 16:17:41.587003 systemd-journald[184]: Journal stopped
Feb 13 16:17:43.481019 systemd-journald[184]: Received SIGTERM from PID 1 (systemd).
Feb 13 16:17:43.481161 kernel: SELinux: policy capability network_peer_controls=1
Feb 13 16:17:43.481187 kernel: SELinux: policy capability open_perms=1
Feb 13 16:17:43.481206 kernel: SELinux: policy capability extended_socket_class=1
Feb 13 16:17:43.481248 kernel: SELinux: policy capability always_check_network=0
Feb 13 16:17:43.481267 kernel: SELinux: policy capability cgroup_seclabel=1
Feb 13 16:17:43.481284 kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 13 16:17:43.481309 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Feb 13 16:17:43.481332 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Feb 13 16:17:43.481350 systemd[1]: Successfully loaded SELinux policy in 70.840ms.
Feb 13 16:17:43.481387 kernel: audit: type=1403 audit(1739463461.941:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 13 16:17:43.481413 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 19.062ms.
Feb 13 16:17:43.481436 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Feb 13 16:17:43.481456 systemd[1]: Detected virtualization kvm.
Feb 13 16:17:43.481476 systemd[1]: Detected architecture x86-64.
Feb 13 16:17:43.481496 systemd[1]: Detected first boot.
Feb 13 16:17:43.481519 systemd[1]: Hostname set to .
Feb 13 16:17:43.481538 systemd[1]: Initializing machine ID from VM UUID.
Feb 13 16:17:43.481557 zram_generator::config[1033]: No configuration found.
Feb 13 16:17:43.481579 systemd[1]: Populated /etc with preset unit settings.
Feb 13 16:17:43.481605 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 13 16:17:43.481623 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Feb 13 16:17:43.481643 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 13 16:17:43.481666 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Feb 13 16:17:43.481690 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Feb 13 16:17:43.481711 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Feb 13 16:17:43.481734 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Feb 13 16:17:43.481756 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Feb 13 16:17:43.481775 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Feb 13 16:17:43.481794 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Feb 13 16:17:43.481814 systemd[1]: Created slice user.slice - User and Session Slice.
Feb 13 16:17:43.481832 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 16:17:43.481852 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 16:17:43.481876 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Feb 13 16:17:43.481903 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Feb 13 16:17:43.481924 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Feb 13 16:17:43.481945 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 16:17:43.481964 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Feb 13 16:17:43.481984 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 16:17:43.482005 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Feb 13 16:17:43.482030 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Feb 13 16:17:43.482049 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Feb 13 16:17:43.482069 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Feb 13 16:17:43.482088 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 16:17:43.482107 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 16:17:43.482125 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 16:17:43.482145 systemd[1]: Reached target swap.target - Swaps.
Feb 13 16:17:43.482165 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Feb 13 16:17:43.482189 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Feb 13 16:17:43.482210 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 16:17:43.486354 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 16:17:43.486396 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 16:17:43.486408 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Feb 13 16:17:43.486420 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Feb 13 16:17:43.486432 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Feb 13 16:17:43.486444 systemd[1]: Mounting media.mount - External Media Directory...
Feb 13 16:17:43.486456 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:43.486481 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Feb 13 16:17:43.486494 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Feb 13 16:17:43.486507 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Feb 13 16:17:43.486519 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 13 16:17:43.486531 systemd[1]: Reached target machines.target - Containers.
Feb 13 16:17:43.486542 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Feb 13 16:17:43.486554 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 16:17:43.486565 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 16:17:43.486581 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Feb 13 16:17:43.486593 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 16:17:43.486605 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 16:17:43.486616 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 16:17:43.486629 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Feb 13 16:17:43.486642 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 16:17:43.486655 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Feb 13 16:17:43.486667 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 13 16:17:43.486678 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Feb 13 16:17:43.486692 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Feb 13 16:17:43.486704 systemd[1]: Stopped systemd-fsck-usr.service.
Feb 13 16:17:43.486715 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 16:17:43.486727 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 16:17:43.486738 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Feb 13 16:17:43.486750 kernel: loop: module loaded
Feb 13 16:17:43.486762 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Feb 13 16:17:43.486773 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 16:17:43.486786 systemd[1]: verity-setup.service: Deactivated successfully.
Feb 13 16:17:43.486801 systemd[1]: Stopped verity-setup.service.
Feb 13 16:17:43.486814 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:43.486825 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Feb 13 16:17:43.486837 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Feb 13 16:17:43.486848 systemd[1]: Mounted media.mount - External Media Directory.
Feb 13 16:17:43.486859 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Feb 13 16:17:43.486871 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Feb 13 16:17:43.486883 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Feb 13 16:17:43.486897 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 16:17:43.486912 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 13 16:17:43.486930 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Feb 13 16:17:43.486951 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 16:17:43.486967 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 16:17:43.486985 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 16:17:43.487002 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 16:17:43.487019 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 16:17:43.487036 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 16:17:43.487055 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 16:17:43.487074 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Feb 13 16:17:43.487098 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Feb 13 16:17:43.487176 systemd-journald[1109]: Collecting audit messages is disabled.
Feb 13 16:17:43.487218 systemd[1]: Reached target network-pre.target - Preparation for Network.
Feb 13 16:17:43.487289 systemd-journald[1109]: Journal started
Feb 13 16:17:43.490197 systemd-journald[1109]: Runtime Journal (/run/log/journal/f6a9ff72ed1f4412ab1bbeae1df04298) is 4.9M, max 39.3M, 34.4M free.
Feb 13 16:17:42.959060 systemd[1]: Queued start job for default target multi-user.target.
Feb 13 16:17:42.989117 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Feb 13 16:17:43.498700 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Feb 13 16:17:42.989820 systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 13 16:17:43.507596 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Feb 13 16:17:43.507741 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 16:17:43.512387 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Feb 13 16:17:43.521718 kernel: fuse: init (API version 7.39)
Feb 13 16:17:43.521851 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Feb 13 16:17:43.552168 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Feb 13 16:17:43.552303 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 16:17:43.555278 kernel: ACPI: bus type drm_connector registered
Feb 13 16:17:43.555397 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Feb 13 16:17:43.564417 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 16:17:43.576265 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Feb 13 16:17:43.580283 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 16:17:43.590585 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 16:17:43.604278 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Feb 13 16:17:43.617637 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Feb 13 16:17:43.630811 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 16:17:43.633747 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Feb 13 16:17:43.636045 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 16:17:43.637387 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 16:17:43.647986 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 13 16:17:43.649502 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Feb 13 16:17:43.652181 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Feb 13 16:17:43.655372 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Feb 13 16:17:43.661426 kernel: loop0: detected capacity change from 0 to 8
Feb 13 16:17:43.681269 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Feb 13 16:17:43.721334 kernel: loop1: detected capacity change from 0 to 211296
Feb 13 16:17:43.782015 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Feb 13 16:17:43.783608 systemd-tmpfiles[1135]: ACLs are not supported, ignoring.
Feb 13 16:17:43.783630 systemd-tmpfiles[1135]: ACLs are not supported, ignoring.
Feb 13 16:17:43.799108 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Feb 13 16:17:43.805070 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 16:17:43.816296 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 16:17:43.843051 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Feb 13 16:17:43.850397 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Feb 13 16:17:43.851947 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Feb 13 16:17:43.859152 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Feb 13 16:17:43.877628 systemd-journald[1109]: Time spent on flushing to /var/log/journal/f6a9ff72ed1f4412ab1bbeae1df04298 is 43.767ms for 982 entries.
Feb 13 16:17:43.877628 systemd-journald[1109]: System Journal (/var/log/journal/f6a9ff72ed1f4412ab1bbeae1df04298) is 8.0M, max 195.6M, 187.6M free.
Feb 13 16:17:43.944421 systemd-journald[1109]: Received client request to flush runtime journal.
Feb 13 16:17:43.944519 kernel: loop2: detected capacity change from 0 to 140992
Feb 13 16:17:43.872712 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Feb 13 16:17:43.958107 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Feb 13 16:17:43.992727 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Feb 13 16:17:43.993759 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Feb 13 16:17:44.000423 kernel: loop3: detected capacity change from 0 to 138184
Feb 13 16:17:44.093769 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 16:17:44.099131 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Feb 13 16:17:44.108351 kernel: loop4: detected capacity change from 0 to 8
Feb 13 16:17:44.123865 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 16:17:44.147284 kernel: loop5: detected capacity change from 0 to 211296
Feb 13 16:17:44.132488 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Feb 13 16:17:44.193274 kernel: loop6: detected capacity change from 0 to 140992
Feb 13 16:17:44.242312 kernel: loop7: detected capacity change from 0 to 138184
Feb 13 16:17:44.244021 systemd-tmpfiles[1179]: ACLs are not supported, ignoring.
Feb 13 16:17:44.246113 systemd-tmpfiles[1179]: ACLs are not supported, ignoring.
Feb 13 16:17:44.277880 udevadm[1180]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Feb 13 16:17:44.278123 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 16:17:44.302311 (sd-merge)[1176]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-digitalocean'.
Feb 13 16:17:44.306100 (sd-merge)[1176]: Merged extensions into '/usr'.
Feb 13 16:17:44.320806 systemd[1]: Reloading requested from client PID 1134 ('systemd-sysext') (unit systemd-sysext.service)...
Feb 13 16:17:44.321077 systemd[1]: Reloading...
Feb 13 16:17:44.542987 zram_generator::config[1214]: No configuration found.
Feb 13 16:17:44.833348 ldconfig[1130]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Feb 13 16:17:44.923346 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 16:17:45.000270 systemd[1]: Reloading finished in 678 ms.
Feb 13 16:17:45.025900 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Feb 13 16:17:45.028425 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Feb 13 16:17:45.043666 systemd[1]: Starting ensure-sysext.service...
Feb 13 16:17:45.054841 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 16:17:45.082477 systemd[1]: Reloading requested from client PID 1251 ('systemctl') (unit ensure-sysext.service)...
Feb 13 16:17:45.082504 systemd[1]: Reloading...
Feb 13 16:17:45.138403 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Feb 13 16:17:45.139696 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Feb 13 16:17:45.143412 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Feb 13 16:17:45.144065 systemd-tmpfiles[1252]: ACLs are not supported, ignoring.
Feb 13 16:17:45.144293 systemd-tmpfiles[1252]: ACLs are not supported, ignoring.
Feb 13 16:17:45.152409 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 16:17:45.152627 systemd-tmpfiles[1252]: Skipping /boot
Feb 13 16:17:45.189037 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 16:17:45.190444 systemd-tmpfiles[1252]: Skipping /boot
Feb 13 16:17:45.254306 zram_generator::config[1278]: No configuration found.
Feb 13 16:17:45.494738 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 16:17:45.564124 systemd[1]: Reloading finished in 480 ms.
Feb 13 16:17:45.590443 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Feb 13 16:17:45.596184 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 16:17:45.618604 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Feb 13 16:17:45.624886 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Feb 13 16:17:45.631563 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Feb 13 16:17:45.644425 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 16:17:45.651776 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 16:17:45.665115 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Feb 13 16:17:45.681083 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:45.681434 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 16:17:45.690628 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 16:17:45.695986 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 16:17:45.713720 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 16:17:45.715862 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 16:17:45.721795 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Feb 13 16:17:45.722388 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:45.730839 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:45.731437 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 16:17:45.738639 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 16:17:45.740505 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 16:17:45.740682 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:45.747733 systemd[1]: Finished ensure-sysext.service.
Feb 13 16:17:45.748890 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Feb 13 16:17:45.758525 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Feb 13 16:17:45.769505 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Feb 13 16:17:45.784670 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Feb 13 16:17:45.798013 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 16:17:45.798339 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 16:17:45.828812 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 16:17:45.829147 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 16:17:45.830744 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 16:17:45.832551 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 16:17:45.833918 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Feb 13 16:17:45.839974 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 16:17:45.840077 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 16:17:45.849628 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 16:17:45.850615 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 16:17:45.868072 systemd-udevd[1331]: Using default interface naming scheme 'v255'.
Feb 13 16:17:45.875943 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Feb 13 16:17:45.878515 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Feb 13 16:17:45.883891 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Feb 13 16:17:45.887424 augenrules[1365]: No rules
Feb 13 16:17:45.889729 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 16:17:45.890031 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Feb 13 16:17:45.910601 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 16:17:45.923783 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 16:17:45.997378 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Feb 13 16:17:46.001972 systemd[1]: Reached target time-set.target - System Time Set.
Feb 13 16:17:46.108384 systemd[1]: Mounting media-configdrive.mount - /media/configdrive...
Feb 13 16:17:46.109007 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:46.109206 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Feb 13 16:17:46.116762 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 16:17:46.126445 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 16:17:46.130763 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 16:17:46.131005 systemd-resolved[1327]: Positive Trust Anchors:
Feb 13 16:17:46.131022 systemd-resolved[1327]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 16:17:46.131071 systemd-resolved[1327]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 16:17:46.132425 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 16:17:46.132502 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Feb 13 16:17:46.132528 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 16:17:46.139459 systemd-networkd[1377]: lo: Link UP
Feb 13 16:17:46.139474 systemd-networkd[1377]: lo: Gained carrier
Feb 13 16:17:46.144191 systemd-networkd[1377]: Enumeration completed
Feb 13 16:17:46.151082 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 16:17:46.164517 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Feb 13 16:17:46.172972 systemd-resolved[1327]: Using system hostname 'ci-4152.2.1-0-f194220f8f'.
Feb 13 16:17:46.177280 kernel: ISO 9660 Extensions: RRIP_1991A
Feb 13 16:17:46.187604 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 16:17:46.188784 systemd[1]: Mounted media-configdrive.mount - /media/configdrive.
Feb 13 16:17:46.190734 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 16:17:46.192358 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 16:17:46.197409 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1384)
Feb 13 16:17:46.198052 systemd[1]: Reached target network.target - Network.
Feb 13 16:17:46.201886 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 16:17:46.203142 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 16:17:46.217424 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Feb 13 16:17:46.225435 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 16:17:46.225912 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 16:17:46.229636 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 16:17:46.229903 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 16:17:46.233139 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 16:17:46.323287 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Feb 13 16:17:46.330305 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 13 16:17:46.332392 systemd-networkd[1377]: eth0: Configuring with /run/systemd/network/10-2e:73:4a:27:76:7c.network.
Feb 13 16:17:46.335151 systemd-networkd[1377]: eth0: Link UP
Feb 13 16:17:46.335162 systemd-networkd[1377]: eth0: Gained carrier
Feb 13 16:17:46.337538 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Feb 13 16:17:46.340196 systemd-timesyncd[1345]: Network configuration changed, trying to establish connection.
Feb 13 16:17:46.343757 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Feb 13 16:17:46.347203 systemd-networkd[1377]: eth1: Configuring with /run/systemd/network/10-56:91:61:76:44:d2.network.
Feb 13 16:17:46.349179 systemd-networkd[1377]: eth1: Link UP
Feb 13 16:17:46.349192 systemd-networkd[1377]: eth1: Gained carrier
Feb 13 16:17:46.349807 systemd-timesyncd[1345]: Network configuration changed, trying to establish connection.
Feb 13 16:17:46.353263 kernel: ACPI: button: Power Button [PWRF]
Feb 13 16:17:46.355906 systemd-timesyncd[1345]: Network configuration changed, trying to establish connection.
Feb 13 16:17:46.357374 systemd-timesyncd[1345]: Network configuration changed, trying to establish connection.
Feb 13 16:17:46.384983 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Feb 13 16:17:46.412284 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Feb 13 16:17:46.466600 kernel: mousedev: PS/2 mouse device common for all mice
Feb 13 16:17:46.489764 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 16:17:46.568322 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 13 16:17:46.568410 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 13 16:17:46.584463 kernel: Console: switching to colour dummy device 80x25
Feb 13 16:17:46.587049 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 13 16:17:46.587170 kernel: [drm] features: -context_init
Feb 13 16:17:46.593824 kernel: [drm] number of scanouts: 1
Feb 13 16:17:46.593972 kernel: [drm] number of cap sets: 0
Feb 13 16:17:46.602274 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Feb 13 16:17:46.615164 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 16:17:46.615695 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:46.625480 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 13 16:17:46.630827 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 16:17:46.633321 kernel: Console: switching to colour frame buffer device 128x48
Feb 13 16:17:46.644294 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 13 16:17:46.661281 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 16:17:46.661543 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:46.675414 kernel: EDAC MC: Ver: 3.0.0
Feb 13 16:17:46.672822 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 16:17:46.712593 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Feb 13 16:17:46.725817 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Feb 13 16:17:46.759480 lvm[1436]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 16:17:46.781118 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 16:17:46.814212 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Feb 13 16:17:46.817050 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 16:17:46.817873 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 16:17:46.818185 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Feb 13 16:17:46.818437 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Feb 13 16:17:46.818854 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Feb 13 16:17:46.819187 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Feb 13 16:17:46.819424 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Feb 13 16:17:46.819573 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Feb 13 16:17:46.819603 systemd[1]: Reached target paths.target - Path Units.
Feb 13 16:17:46.819673 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 16:17:46.824099 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Feb 13 16:17:46.837366 systemd[1]: Starting docker.socket - Docker Socket for the API...
Feb 13 16:17:46.861905 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Feb 13 16:17:46.865116 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Feb 13 16:17:46.873411 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Feb 13 16:17:46.874103 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 16:17:46.874792 systemd[1]: Reached target basic.target - Basic System.
Feb 13 16:17:46.875972 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Feb 13 16:17:46.876017 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Feb 13 16:17:46.896454 systemd[1]: Starting containerd.service - containerd container runtime...
Feb 13 16:17:46.901534 lvm[1442]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 16:17:46.907787 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Feb 13 16:17:46.924707 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Feb 13 16:17:46.932389 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Feb 13 16:17:46.937311 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Feb 13 16:17:46.937858 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Feb 13 16:17:46.947620 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Feb 13 16:17:46.957558 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Feb 13 16:17:46.967536 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Feb 13 16:17:46.986195 systemd[1]: Starting systemd-logind.service - User Login Management...
Feb 13 16:17:46.989493 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Feb 13 16:17:46.990312 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Feb 13 16:17:46.996533 systemd[1]: Starting update-engine.service - Update Engine...
Feb 13 16:17:47.000601 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Feb 13 16:17:47.003092 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Feb 13 16:17:47.009509 jq[1448]: false
Feb 13 16:17:47.014912 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Feb 13 16:17:47.015172 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Feb 13 16:17:47.038912 coreos-metadata[1444]: Feb 13 16:17:47.038 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 16:17:47.059139 coreos-metadata[1444]: Feb 13 16:17:47.054 INFO Fetch successful
Feb 13 16:17:47.059369 jq[1455]: true
Feb 13 16:17:47.066917 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Feb 13 16:17:47.067241 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found loop4
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found loop5
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found loop6
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found loop7
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda1
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda2
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda3
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found usr
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda4
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda6
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda7
Feb 13 16:17:47.074714 extend-filesystems[1449]: Found vda9
Feb 13 16:17:47.074714 extend-filesystems[1449]: Checking size of /dev/vda9
Feb 13 16:17:47.088444 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Feb 13 16:17:47.167878 update_engine[1454]: I20250213 16:17:47.148598 1454 main.cc:92] Flatcar Update Engine starting
Feb 13 16:17:47.087426 dbus-daemon[1445]: [system] SELinux support is enabled
Feb 13 16:17:47.115512 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Feb 13 16:17:47.192316 extend-filesystems[1449]: Resized partition /dev/vda9
Feb 13 16:17:47.202831 update_engine[1454]: I20250213 16:17:47.190563 1454 update_check_scheduler.cc:74] Next update check in 5m42s
Feb 13 16:17:47.202930 jq[1463]: true
Feb 13 16:17:47.115564 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Feb 13 16:17:47.133138 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Feb 13 16:17:47.133268 systemd[1]: user-configdrive.service - Load cloud-config from /media/configdrive was skipped because of an unmet condition check (ConditionKernelCommandLine=!flatcar.oem.id=digitalocean).
Feb 13 16:17:47.133306 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Feb 13 16:17:47.138936 (ntainerd)[1474]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Feb 13 16:17:47.139664 systemd[1]: motdgen.service: Deactivated successfully.
Feb 13 16:17:47.139955 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Feb 13 16:17:47.178413 systemd[1]: Started update-engine.service - Update Engine.
Feb 13 16:17:47.204523 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Feb 13 16:17:47.206820 extend-filesystems[1485]: resize2fs 1.47.1 (20-May-2024)
Feb 13 16:17:47.228294 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 15121403 blocks
Feb 13 16:17:47.239845 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Feb 13 16:17:47.244909 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Feb 13 16:17:47.280793 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (1381)
Feb 13 16:17:47.404501 bash[1503]: Updated "/home/core/.ssh/authorized_keys"
Feb 13 16:17:47.407153 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Feb 13 16:17:47.437538 systemd[1]: Starting sshkeys.service...
Feb 13 16:17:47.449187 systemd-logind[1453]: New seat seat0.
Feb 13 16:17:47.475888 systemd-logind[1453]: Watching system buttons on /dev/input/event1 (Power Button)
Feb 13 16:17:47.475924 systemd-logind[1453]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Feb 13 16:17:47.476343 systemd[1]: Started systemd-logind.service - User Login Management.
Feb 13 16:17:47.504926 kernel: EXT4-fs (vda9): resized filesystem to 15121403
Feb 13 16:17:47.533561 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Feb 13 16:17:47.554697 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Feb 13 16:17:47.598682 systemd-networkd[1377]: eth0: Gained IPv6LL
Feb 13 16:17:47.599509 systemd-timesyncd[1345]: Network configuration changed, trying to establish connection.
Feb 13 16:17:47.604221 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Feb 13 16:17:47.610102 extend-filesystems[1485]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Feb 13 16:17:47.610102 extend-filesystems[1485]: old_desc_blocks = 1, new_desc_blocks = 8
Feb 13 16:17:47.610102 extend-filesystems[1485]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long.
Feb 13 16:17:47.613604 systemd[1]: extend-filesystems.service: Deactivated successfully.
Feb 13 16:17:47.651134 extend-filesystems[1449]: Resized filesystem in /dev/vda9
Feb 13 16:17:47.651134 extend-filesystems[1449]: Found vdb
Feb 13 16:17:47.614133 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Feb 13 16:17:47.621100 locksmithd[1487]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Feb 13 16:17:47.649822 systemd[1]: Reached target network-online.target - Network is Online.
Feb 13 16:17:47.671924 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:17:47.686940 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Feb 13 16:17:47.771388 coreos-metadata[1510]: Feb 13 16:17:47.771 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
Feb 13 16:17:47.793295 coreos-metadata[1510]: Feb 13 16:17:47.791 INFO Fetch successful
Feb 13 16:17:47.794366 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Feb 13 16:17:47.798770 sshd_keygen[1476]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Feb 13 16:17:47.819210 unknown[1510]: wrote ssh authorized keys file for user: core
Feb 13 16:17:47.855426 systemd-networkd[1377]: eth1: Gained IPv6LL
Feb 13 16:17:47.857802 systemd-timesyncd[1345]: Network configuration changed, trying to establish connection.
Feb 13 16:17:47.880445 update-ssh-keys[1535]: Updated "/home/core/.ssh/authorized_keys"
Feb 13 16:17:47.874355 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Feb 13 16:17:47.886551 systemd[1]: Finished sshkeys.service.
Feb 13 16:17:47.926941 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Feb 13 16:17:47.941620 systemd[1]: Starting issuegen.service - Generate /run/issue...
Feb 13 16:17:47.982112 systemd[1]: issuegen.service: Deactivated successfully.
Feb 13 16:17:47.982416 systemd[1]: Finished issuegen.service - Generate /run/issue.
Feb 13 16:17:47.992616 containerd[1474]: time="2025-02-13T16:17:47.992522140Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Feb 13 16:17:47.996893 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Feb 13 16:17:48.028742 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Feb 13 16:17:48.040676 systemd[1]: Started getty@tty1.service - Getty on tty1.
Feb 13 16:17:48.049813 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Feb 13 16:17:48.054076 systemd[1]: Reached target getty.target - Login Prompts.
Feb 13 16:17:48.070720 containerd[1474]: time="2025-02-13T16:17:48.070640312Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.073555 containerd[1474]: time="2025-02-13T16:17:48.073478073Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:17:48.073746 containerd[1474]: time="2025-02-13T16:17:48.073718601Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Feb 13 16:17:48.073838 containerd[1474]: time="2025-02-13T16:17:48.073820496Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Feb 13 16:17:48.074141 containerd[1474]: time="2025-02-13T16:17:48.074115045Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Feb 13 16:17:48.074291 containerd[1474]: time="2025-02-13T16:17:48.074272284Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.074433 containerd[1474]: time="2025-02-13T16:17:48.074414423Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:17:48.074492 containerd[1474]: time="2025-02-13T16:17:48.074480089Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.074786 containerd[1474]: time="2025-02-13T16:17:48.074764369Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:17:48.074864 containerd[1474]: time="2025-02-13T16:17:48.074850146Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.074930 containerd[1474]: time="2025-02-13T16:17:48.074913765Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:17:48.075004 containerd[1474]: time="2025-02-13T16:17:48.074988204Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.075186 containerd[1474]: time="2025-02-13T16:17:48.075166313Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.075614 containerd[1474]: time="2025-02-13T16:17:48.075589870Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:17:48.075892 containerd[1474]: time="2025-02-13T16:17:48.075864740Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:17:48.076596 containerd[1474]: time="2025-02-13T16:17:48.075955248Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Feb 13 16:17:48.076596 containerd[1474]: time="2025-02-13T16:17:48.076074116Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Feb 13 16:17:48.076596 containerd[1474]: time="2025-02-13T16:17:48.076139176Z" level=info msg="metadata content store policy set" policy=shared
Feb 13 16:17:48.086271 containerd[1474]: time="2025-02-13T16:17:48.086188320Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.086490205Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.086527113Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.086559095Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.086584456Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.086871085Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087395630Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087660406Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087690333Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087718326Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087742470Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087766494Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087788128Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087819967Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088132 containerd[1474]: time="2025-02-13T16:17:48.087845626Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.087867647Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.087888676Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.087912099Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.087946856Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.087971785Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.087993471Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.088017835Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.088040586Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.088705 containerd[1474]: time="2025-02-13T16:17:48.088062994Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089055 containerd[1474]: time="2025-02-13T16:17:48.089011985Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089141 containerd[1474]: time="2025-02-13T16:17:48.089124131Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089213 containerd[1474]: time="2025-02-13T16:17:48.089197462Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089336 containerd[1474]: time="2025-02-13T16:17:48.089296234Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089467 containerd[1474]: time="2025-02-13T16:17:48.089417443Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089578 containerd[1474]: time="2025-02-13T16:17:48.089558461Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089646 containerd[1474]: time="2025-02-13T16:17:48.089632802Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089739 containerd[1474]: time="2025-02-13T16:17:48.089720619Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Feb 13 16:17:48.089893 containerd[1474]: time="2025-02-13T16:17:48.089873662Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.089985 containerd[1474]: time="2025-02-13T16:17:48.089968686Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.090066 containerd[1474]: time="2025-02-13T16:17:48.090038962Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Feb 13 16:17:48.090309 containerd[1474]: time="2025-02-13T16:17:48.090286895Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090409024Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090438845Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090459573Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090476052Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090498721Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090517349Z" level=info msg="NRI interface is disabled by configuration."
Feb 13 16:17:48.091277 containerd[1474]: time="2025-02-13T16:17:48.090535155Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Feb 13 16:17:48.091539 containerd[1474]: time="2025-02-13T16:17:48.090991883Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 16:17:48.091539 containerd[1474]: time="2025-02-13T16:17:48.091073957Z" level=info msg="Connect containerd service" Feb 13 16:17:48.091539 containerd[1474]: time="2025-02-13T16:17:48.091142855Z" level=info msg="using legacy CRI server" Feb 13 16:17:48.091539 containerd[1474]: time="2025-02-13T16:17:48.091154528Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 16:17:48.092150 containerd[1474]: time="2025-02-13T16:17:48.092119341Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 16:17:48.093445 containerd[1474]: time="2025-02-13T16:17:48.093408287Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 16:17:48.093741 containerd[1474]: time="2025-02-13T16:17:48.093691746Z" level=info msg="Start subscribing containerd event" Feb 13 
16:17:48.093867 containerd[1474]: time="2025-02-13T16:17:48.093845660Z" level=info msg="Start recovering state" Feb 13 16:17:48.094397 containerd[1474]: time="2025-02-13T16:17:48.094017513Z" level=info msg="Start event monitor" Feb 13 16:17:48.094397 containerd[1474]: time="2025-02-13T16:17:48.094049274Z" level=info msg="Start snapshots syncer" Feb 13 16:17:48.094397 containerd[1474]: time="2025-02-13T16:17:48.094068419Z" level=info msg="Start cni network conf syncer for default" Feb 13 16:17:48.094397 containerd[1474]: time="2025-02-13T16:17:48.094080211Z" level=info msg="Start streaming server" Feb 13 16:17:48.095175 containerd[1474]: time="2025-02-13T16:17:48.095148716Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 16:17:48.095526 containerd[1474]: time="2025-02-13T16:17:48.095450401Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 16:17:48.096431 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 16:17:48.102424 containerd[1474]: time="2025-02-13T16:17:48.102358153Z" level=info msg="containerd successfully booted in 0.110734s" Feb 13 16:17:49.160177 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:17:49.166107 systemd[1]: Reached target multi-user.target - Multi-User System. Feb 13 16:17:49.170284 systemd[1]: Startup finished in 1.737s (kernel) + 7.130s (initrd) + 7.298s (userspace) = 16.166s. 
Feb 13 16:17:49.175926 (kubelet)[1559]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 16:17:50.354807 kubelet[1559]: E0213 16:17:50.354611 1559 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 16:17:50.359471 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 16:17:50.359684 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 16:17:50.360038 systemd[1]: kubelet.service: Consumed 1.696s CPU time.
Feb 13 16:17:51.274084 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Feb 13 16:17:51.281733 systemd[1]: Started sshd@0-64.227.101.255:22-155.4.245.222:55605.service - OpenSSH per-connection server daemon (155.4.245.222:55605).
Feb 13 16:17:53.103069 sshd[1573]: Invalid user test from 155.4.245.222 port 55605
Feb 13 16:17:53.274309 sshd[1573]: Received disconnect from 155.4.245.222 port 55605:11: Bye Bye [preauth]
Feb 13 16:17:53.274309 sshd[1573]: Disconnected from invalid user test 155.4.245.222 port 55605 [preauth]
Feb 13 16:17:53.277149 systemd[1]: sshd@0-64.227.101.255:22-155.4.245.222:55605.service: Deactivated successfully.
Feb 13 16:17:56.546282 systemd[1]: Started sshd@1-64.227.101.255:22-139.178.89.65:34812.service - OpenSSH per-connection server daemon (139.178.89.65:34812).
Feb 13 16:17:56.618980 sshd[1578]: Accepted publickey for core from 139.178.89.65 port 34812 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:56.622839 sshd-session[1578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:56.641385 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Feb 13 16:17:56.642671 systemd-logind[1453]: New session 1 of user core.
Feb 13 16:17:56.650588 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Feb 13 16:17:56.685911 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Feb 13 16:17:56.705373 systemd[1]: Starting user@500.service - User Manager for UID 500...
Feb 13 16:17:56.710939 (systemd)[1582]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Feb 13 16:17:56.898440 systemd[1582]: Queued start job for default target default.target.
Feb 13 16:17:56.917364 systemd[1582]: Created slice app.slice - User Application Slice.
Feb 13 16:17:56.917433 systemd[1582]: Reached target paths.target - Paths.
Feb 13 16:17:56.917460 systemd[1582]: Reached target timers.target - Timers.
Feb 13 16:17:56.920734 systemd[1582]: Starting dbus.socket - D-Bus User Message Bus Socket...
Feb 13 16:17:56.940461 systemd[1582]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Feb 13 16:17:56.941574 systemd[1582]: Reached target sockets.target - Sockets.
Feb 13 16:17:56.941605 systemd[1582]: Reached target basic.target - Basic System.
Feb 13 16:17:56.941677 systemd[1582]: Reached target default.target - Main User Target.
Feb 13 16:17:56.941714 systemd[1582]: Startup finished in 217ms.
Feb 13 16:17:56.941913 systemd[1]: Started user@500.service - User Manager for UID 500.
Feb 13 16:17:56.951264 systemd[1]: Started session-1.scope - Session 1 of User core.
Feb 13 16:17:57.034012 systemd[1]: Started sshd@2-64.227.101.255:22-139.178.89.65:34828.service - OpenSSH per-connection server daemon (139.178.89.65:34828).
Feb 13 16:17:57.141544 sshd[1593]: Accepted publickey for core from 139.178.89.65 port 34828 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:57.144585 sshd-session[1593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:57.153353 systemd-logind[1453]: New session 2 of user core.
Feb 13 16:17:57.156876 systemd[1]: Started session-2.scope - Session 2 of User core.
Feb 13 16:17:57.239336 sshd[1595]: Connection closed by 139.178.89.65 port 34828
Feb 13 16:17:57.240607 sshd-session[1593]: pam_unix(sshd:session): session closed for user core
Feb 13 16:17:57.260402 systemd[1]: sshd@2-64.227.101.255:22-139.178.89.65:34828.service: Deactivated successfully.
Feb 13 16:17:57.263766 systemd[1]: session-2.scope: Deactivated successfully.
Feb 13 16:17:57.270615 systemd-logind[1453]: Session 2 logged out. Waiting for processes to exit.
Feb 13 16:17:57.275801 systemd[1]: Started sshd@3-64.227.101.255:22-139.178.89.65:34834.service - OpenSSH per-connection server daemon (139.178.89.65:34834).
Feb 13 16:17:57.277962 systemd-logind[1453]: Removed session 2.
Feb 13 16:17:57.368289 sshd[1600]: Accepted publickey for core from 139.178.89.65 port 34834 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:57.370498 sshd-session[1600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:57.384091 systemd-logind[1453]: New session 3 of user core.
Feb 13 16:17:57.393602 systemd[1]: Started session-3.scope - Session 3 of User core.
Feb 13 16:17:57.469965 sshd[1602]: Connection closed by 139.178.89.65 port 34834
Feb 13 16:17:57.470638 sshd-session[1600]: pam_unix(sshd:session): session closed for user core
Feb 13 16:17:57.499197 systemd[1]: sshd@3-64.227.101.255:22-139.178.89.65:34834.service: Deactivated successfully.
Feb 13 16:17:57.502916 systemd[1]: session-3.scope: Deactivated successfully.
Feb 13 16:17:57.510990 systemd-logind[1453]: Session 3 logged out. Waiting for processes to exit.
Feb 13 16:17:57.516840 systemd[1]: Started sshd@4-64.227.101.255:22-139.178.89.65:34842.service - OpenSSH per-connection server daemon (139.178.89.65:34842).
Feb 13 16:17:57.519345 systemd-logind[1453]: Removed session 3.
Feb 13 16:17:57.580276 sshd[1607]: Accepted publickey for core from 139.178.89.65 port 34842 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:57.583270 sshd-session[1607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:57.590382 systemd-logind[1453]: New session 4 of user core.
Feb 13 16:17:57.600653 systemd[1]: Started session-4.scope - Session 4 of User core.
Feb 13 16:17:57.670425 sshd[1609]: Connection closed by 139.178.89.65 port 34842
Feb 13 16:17:57.671183 sshd-session[1607]: pam_unix(sshd:session): session closed for user core
Feb 13 16:17:57.681757 systemd[1]: sshd@4-64.227.101.255:22-139.178.89.65:34842.service: Deactivated successfully.
Feb 13 16:17:57.684440 systemd[1]: session-4.scope: Deactivated successfully.
Feb 13 16:17:57.688564 systemd-logind[1453]: Session 4 logged out. Waiting for processes to exit.
Feb 13 16:17:57.696540 systemd[1]: Started sshd@5-64.227.101.255:22-139.178.89.65:34850.service - OpenSSH per-connection server daemon (139.178.89.65:34850).
Feb 13 16:17:57.701044 systemd-logind[1453]: Removed session 4.
Feb 13 16:17:57.777394 sshd[1614]: Accepted publickey for core from 139.178.89.65 port 34850 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:57.779731 sshd-session[1614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:57.790137 systemd-logind[1453]: New session 5 of user core.
Feb 13 16:17:57.797609 systemd[1]: Started session-5.scope - Session 5 of User core.
Feb 13 16:17:57.907017 sudo[1617]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Feb 13 16:17:57.916080 sudo[1617]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:17:57.933970 sudo[1617]: pam_unix(sudo:session): session closed for user root
Feb 13 16:17:57.939150 sshd[1616]: Connection closed by 139.178.89.65 port 34850
Feb 13 16:17:57.940991 sshd-session[1614]: pam_unix(sshd:session): session closed for user core
Feb 13 16:17:57.959604 systemd[1]: sshd@5-64.227.101.255:22-139.178.89.65:34850.service: Deactivated successfully.
Feb 13 16:17:57.966660 systemd[1]: session-5.scope: Deactivated successfully.
Feb 13 16:17:57.973667 systemd-logind[1453]: Session 5 logged out. Waiting for processes to exit.
Feb 13 16:17:57.981797 systemd[1]: Started sshd@6-64.227.101.255:22-139.178.89.65:34862.service - OpenSSH per-connection server daemon (139.178.89.65:34862).
Feb 13 16:17:57.985885 systemd-logind[1453]: Removed session 5.
Feb 13 16:17:58.054180 sshd[1622]: Accepted publickey for core from 139.178.89.65 port 34862 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:58.057500 sshd-session[1622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:58.066897 systemd-logind[1453]: New session 6 of user core.
Feb 13 16:17:58.073801 systemd[1]: Started session-6.scope - Session 6 of User core.
Feb 13 16:17:58.153838 sudo[1626]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Feb 13 16:17:58.154420 sudo[1626]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:17:58.163091 sudo[1626]: pam_unix(sudo:session): session closed for user root
Feb 13 16:17:58.174651 sudo[1625]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Feb 13 16:17:58.175035 sudo[1625]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:17:58.204186 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Feb 13 16:17:58.274118 augenrules[1648]: No rules
Feb 13 16:17:58.275655 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 16:17:58.276220 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Feb 13 16:17:58.278809 sudo[1625]: pam_unix(sudo:session): session closed for user root
Feb 13 16:17:58.284262 sshd[1624]: Connection closed by 139.178.89.65 port 34862
Feb 13 16:17:58.284936 sshd-session[1622]: pam_unix(sshd:session): session closed for user core
Feb 13 16:17:58.299916 systemd[1]: sshd@6-64.227.101.255:22-139.178.89.65:34862.service: Deactivated successfully.
Feb 13 16:17:58.304275 systemd[1]: session-6.scope: Deactivated successfully.
Feb 13 16:17:58.310711 systemd-logind[1453]: Session 6 logged out. Waiting for processes to exit.
Feb 13 16:17:58.321686 systemd[1]: Started sshd@7-64.227.101.255:22-139.178.89.65:34878.service - OpenSSH per-connection server daemon (139.178.89.65:34878).
Feb 13 16:17:58.323714 systemd-logind[1453]: Removed session 6.
Feb 13 16:17:58.428597 sshd[1656]: Accepted publickey for core from 139.178.89.65 port 34878 ssh2: RSA SHA256:AMPu2lZjn4SqDYANHPtTget7vBQBooUjf0mriNIzIUY
Feb 13 16:17:58.431624 sshd-session[1656]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:17:58.446333 systemd-logind[1453]: New session 7 of user core.
Feb 13 16:17:58.457639 systemd[1]: Started session-7.scope - Session 7 of User core.
Feb 13 16:17:58.536723 sudo[1659]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Feb 13 16:17:58.538013 sudo[1659]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:18:00.085125 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:18:00.086149 systemd[1]: kubelet.service: Consumed 1.696s CPU time.
Feb 13 16:18:00.099101 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:18:00.164561 systemd[1]: Reloading requested from client PID 1696 ('systemctl') (unit session-7.scope)...
Feb 13 16:18:00.164586 systemd[1]: Reloading...
Feb 13 16:18:00.441917 zram_generator::config[1737]: No configuration found.
Feb 13 16:18:00.727068 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 16:18:00.868386 systemd[1]: Reloading finished in 702 ms.
Feb 13 16:18:01.003530 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Feb 13 16:18:01.003673 systemd[1]: kubelet.service: Failed with result 'signal'.
Feb 13 16:18:01.004557 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:18:01.009134 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:18:01.280599 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:18:01.281132 (kubelet)[1788]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Feb 13 16:18:01.443207 kubelet[1788]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 16:18:01.445261 kubelet[1788]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 13 16:18:01.445261 kubelet[1788]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 16:18:01.445261 kubelet[1788]: I0213 16:18:01.444009 1788 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 16:18:02.036334 kubelet[1788]: I0213 16:18:02.035535 1788 server.go:487] "Kubelet version" kubeletVersion="v1.29.2"
Feb 13 16:18:02.036334 kubelet[1788]: I0213 16:18:02.035600 1788 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 16:18:02.036334 kubelet[1788]: I0213 16:18:02.036021 1788 server.go:919] "Client rotation is on, will bootstrap in background"
Feb 13 16:18:02.074626 kubelet[1788]: I0213 16:18:02.074368 1788 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 16:18:02.102126 kubelet[1788]: I0213 16:18:02.101999 1788 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 16:18:02.103348 kubelet[1788]: I0213 16:18:02.102706 1788 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 16:18:02.103348 kubelet[1788]: I0213 16:18:02.103170 1788 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Feb 13 16:18:02.103348 kubelet[1788]: I0213 16:18:02.103210 1788 topology_manager.go:138] "Creating topology manager with none policy"
Feb 13 16:18:02.103348 kubelet[1788]: I0213 16:18:02.103222 1788 container_manager_linux.go:301] "Creating device plugin manager"
Feb 13 16:18:02.107153 kubelet[1788]: I0213 16:18:02.106211 1788 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 16:18:02.107153 kubelet[1788]: I0213 16:18:02.106504 1788 kubelet.go:396] "Attempting to sync node with API server"
Feb 13 16:18:02.107153 kubelet[1788]: I0213 16:18:02.106540 1788 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 16:18:02.107153 kubelet[1788]: I0213 16:18:02.106604 1788 kubelet.go:312] "Adding apiserver pod source"
Feb 13 16:18:02.107153 kubelet[1788]: I0213 16:18:02.106629 1788 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 16:18:02.111841 kubelet[1788]: E0213 16:18:02.111790 1788 file.go:98] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:02.111841 kubelet[1788]: E0213 16:18:02.111873 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:02.112387 kubelet[1788]: I0213 16:18:02.112354 1788 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Feb 13 16:18:02.117217 kubelet[1788]: I0213 16:18:02.117150 1788 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 13 16:18:02.129287 kubelet[1788]: W0213 16:18:02.129041 1788 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Feb 13 16:18:02.135965 kubelet[1788]: I0213 16:18:02.135614 1788 server.go:1256] "Started kubelet"
Feb 13 16:18:02.136579 kubelet[1788]: I0213 16:18:02.136327 1788 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 16:18:02.138002 kubelet[1788]: I0213 16:18:02.137818 1788 server.go:461] "Adding debug handlers to kubelet server"
Feb 13 16:18:02.144730 kubelet[1788]: I0213 16:18:02.144055 1788 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 13 16:18:02.144730 kubelet[1788]: I0213 16:18:02.144222 1788 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 13 16:18:02.144730 kubelet[1788]: I0213 16:18:02.144543 1788 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 13 16:18:02.160529 kubelet[1788]: E0213 16:18:02.159509 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found"
Feb 13 16:18:02.160529 kubelet[1788]: I0213 16:18:02.159585 1788 volume_manager.go:291] "Starting Kubelet Volume Manager"
Feb 13 16:18:02.160529 kubelet[1788]: I0213 16:18:02.159801 1788 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Feb 13 16:18:02.160529 kubelet[1788]: I0213 16:18:02.159884 1788 reconciler_new.go:29] "Reconciler: start to sync state"
Feb 13 16:18:02.170789 kubelet[1788]: E0213 16:18:02.170560 1788 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Feb 13 16:18:02.178972 kubelet[1788]: I0213 16:18:02.178727 1788 factory.go:221] Registration of the systemd container factory successfully
Feb 13 16:18:02.178972 kubelet[1788]: I0213 16:18:02.178887 1788 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Feb 13 16:18:02.186324 kubelet[1788]: I0213 16:18:02.184793 1788 factory.go:221] Registration of the containerd container factory successfully
Feb 13 16:18:02.190387 kubelet[1788]: E0213 16:18:02.186739 1788 event.go:346] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{64.227.101.255.1823d0d228744262 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:64.227.101.255,UID:64.227.101.255,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:64.227.101.255,},FirstTimestamp:2025-02-13 16:18:02.135560802 +0000 UTC m=+0.830538802,LastTimestamp:2025-02-13 16:18:02.135560802 +0000 UTC m=+0.830538802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:64.227.101.255,}"
Feb 13 16:18:02.191972 kubelet[1788]: W0213 16:18:02.191805 1788 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 13 16:18:02.192572 kubelet[1788]: E0213 16:18:02.192542 1788 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 13 16:18:02.196658 kubelet[1788]: W0213 16:18:02.196609 1788 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: nodes "64.227.101.255" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 13 16:18:02.196855 kubelet[1788]: E0213 16:18:02.196839 1788 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: nodes "64.227.101.255" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 13 16:18:02.211264 kubelet[1788]: W0213 16:18:02.208772 1788 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 13 16:18:02.214419 kubelet[1788]: E0213 16:18:02.211655 1788 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 13 16:18:02.214419 kubelet[1788]: E0213 16:18:02.211907 1788 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"64.227.101.255\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Feb 13 16:18:02.236206 kubelet[1788]: I0213 16:18:02.235566 1788 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 13 16:18:02.236206 kubelet[1788]: I0213 16:18:02.235601 1788 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 13 16:18:02.236206 kubelet[1788]: I0213 16:18:02.235633 1788 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 16:18:02.244133 kubelet[1788]: E0213 16:18:02.244086 1788 event.go:346] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{64.227.101.255.1823d0d22a89c17a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:64.227.101.255,UID:64.227.101.255,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:64.227.101.255,},FirstTimestamp:2025-02-13 16:18:02.170524026 +0000 UTC m=+0.865501921,LastTimestamp:2025-02-13 16:18:02.170524026 +0000 UTC m=+0.865501921,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:64.227.101.255,}"
Feb 13 16:18:02.250908 kubelet[1788]: I0213 16:18:02.250707 1788 policy_none.go:49] "None policy: Start"
Feb 13 16:18:02.253504 kubelet[1788]: I0213 16:18:02.252907 1788 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 13 16:18:02.253504 kubelet[1788]: I0213 16:18:02.252954 1788 state_mem.go:35] "Initializing new in-memory state store"
Feb 13 16:18:02.262129 kubelet[1788]: I0213 16:18:02.262093 1788 kubelet_node_status.go:73] "Attempting to register node" node="64.227.101.255"
Feb 13 16:18:02.273159 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Feb 13 16:18:02.291500 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Feb 13 16:18:02.305731 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Feb 13 16:18:02.309850 kubelet[1788]: E0213 16:18:02.308080 1788 event.go:346] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{64.227.101.255.1823d0d22e52612b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:64.227.101.255,UID:64.227.101.255,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node 64.227.101.255 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:64.227.101.255,},FirstTimestamp:2025-02-13 16:18:02.234003755 +0000 UTC m=+0.928981640,LastTimestamp:2025-02-13 16:18:02.234003755 +0000 UTC m=+0.928981640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:64.227.101.255,}" Feb 13 16:18:02.309850 kubelet[1788]: E0213 16:18:02.308972 1788 kubelet_node_status.go:96] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="64.227.101.255" Feb 13 16:18:02.313345 kubelet[1788]: E0213 16:18:02.313003 1788 event.go:346] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{64.227.101.255.1823d0d22e5298e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:64.227.101.255,UID:64.227.101.255,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node 64.227.101.255 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:64.227.101.255,},FirstTimestamp:2025-02-13 16:18:02.234018021 +0000 UTC m=+0.928995901,LastTimestamp:2025-02-13 
16:18:02.234018021 +0000 UTC m=+0.928995901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:64.227.101.255,}" Feb 13 16:18:02.316902 kubelet[1788]: I0213 16:18:02.316862 1788 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 16:18:02.318431 kubelet[1788]: I0213 16:18:02.317502 1788 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 16:18:02.321078 kubelet[1788]: E0213 16:18:02.321019 1788 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"64.227.101.255\" not found" Feb 13 16:18:02.336114 kubelet[1788]: E0213 16:18:02.336050 1788 event.go:346] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{64.227.101.255.1823d0d22e52ae21 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:64.227.101.255,UID:64.227.101.255,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node 64.227.101.255 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:64.227.101.255,},FirstTimestamp:2025-02-13 16:18:02.234023457 +0000 UTC m=+0.929001336,LastTimestamp:2025-02-13 16:18:02.234023457 +0000 UTC m=+0.929001336,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:64.227.101.255,}" Feb 13 16:18:02.342884 kubelet[1788]: I0213 16:18:02.342576 1788 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 16:18:02.346168 kubelet[1788]: I0213 16:18:02.346137 1788 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 13 16:18:02.350120 kubelet[1788]: I0213 16:18:02.350077 1788 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 13 16:18:02.350352 kubelet[1788]: I0213 16:18:02.350337 1788 kubelet.go:2329] "Starting kubelet main sync loop" Feb 13 16:18:02.350504 kubelet[1788]: E0213 16:18:02.350494 1788 kubelet.go:2353] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Feb 13 16:18:02.432695 kubelet[1788]: E0213 16:18:02.432641 1788 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"64.227.101.255\" not found" node="64.227.101.255" Feb 13 16:18:02.511874 kubelet[1788]: I0213 16:18:02.511661 1788 kubelet_node_status.go:73] "Attempting to register node" node="64.227.101.255" Feb 13 16:18:02.548335 kubelet[1788]: I0213 16:18:02.548098 1788 kubelet_node_status.go:76] "Successfully registered node" node="64.227.101.255" Feb 13 16:18:02.661398 kubelet[1788]: E0213 16:18:02.661346 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:02.761775 kubelet[1788]: E0213 16:18:02.761682 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:02.862448 kubelet[1788]: E0213 16:18:02.862219 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:02.963066 kubelet[1788]: E0213 16:18:02.962991 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:03.039625 kubelet[1788]: I0213 16:18:03.039529 1788 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 13 16:18:03.039836 kubelet[1788]: W0213 16:18:03.039821 1788 reflector.go:462] vendor/k8s.io/client-go/informers/factory.go:159: watch of *v1.RuntimeClass ended with: 
very short watch: vendor/k8s.io/client-go/informers/factory.go:159: Unexpected watch close - watch lasted less than a second and no items received Feb 13 16:18:03.064307 kubelet[1788]: E0213 16:18:03.064216 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:03.112429 kubelet[1788]: E0213 16:18:03.112363 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:03.164940 kubelet[1788]: E0213 16:18:03.164750 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:03.265186 kubelet[1788]: E0213 16:18:03.265114 1788 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"64.227.101.255\" not found" Feb 13 16:18:03.367373 kubelet[1788]: I0213 16:18:03.367067 1788 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.1.0/24" Feb 13 16:18:03.367806 containerd[1474]: time="2025-02-13T16:18:03.367740890Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 16:18:03.368436 kubelet[1788]: I0213 16:18:03.368275 1788 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.1.0/24" Feb 13 16:18:03.835572 sudo[1659]: pam_unix(sudo:session): session closed for user root Feb 13 16:18:03.843683 sshd[1658]: Connection closed by 139.178.89.65 port 34878 Feb 13 16:18:03.844073 sshd-session[1656]: pam_unix(sshd:session): session closed for user core Feb 13 16:18:03.851090 systemd[1]: sshd@7-64.227.101.255:22-139.178.89.65:34878.service: Deactivated successfully. Feb 13 16:18:03.853043 systemd-logind[1453]: Session 7 logged out. Waiting for processes to exit. Feb 13 16:18:03.858497 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 16:18:03.864678 systemd-logind[1453]: Removed session 7. 
Feb 13 16:18:04.110297 kubelet[1788]: I0213 16:18:04.109824 1788 apiserver.go:52] "Watching apiserver" Feb 13 16:18:04.113279 kubelet[1788]: E0213 16:18:04.113170 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:04.138864 kubelet[1788]: I0213 16:18:04.138793 1788 topology_manager.go:215] "Topology Admit Handler" podUID="ce202485-6d0f-47ce-8917-54c286d3eb4b" podNamespace="calico-system" podName="calico-node-hpnmv" Feb 13 16:18:04.139420 kubelet[1788]: I0213 16:18:04.138996 1788 topology_manager.go:215] "Topology Admit Handler" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" podNamespace="calico-system" podName="csi-node-driver-jlrrr" Feb 13 16:18:04.139420 kubelet[1788]: I0213 16:18:04.139086 1788 topology_manager.go:215] "Topology Admit Handler" podUID="0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b" podNamespace="kube-system" podName="kube-proxy-28l57" Feb 13 16:18:04.141794 kubelet[1788]: E0213 16:18:04.140031 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:04.153485 systemd[1]: Created slice kubepods-besteffort-pod0bcc904a_cb83_4ab1_99a2_5c1a7f773d2b.slice - libcontainer container kubepods-besteffort-pod0bcc904a_cb83_4ab1_99a2_5c1a7f773d2b.slice. 
Feb 13 16:18:04.161072 kubelet[1788]: I0213 16:18:04.160975 1788 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Feb 13 16:18:04.175716 kubelet[1788]: I0213 16:18:04.174735 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-lib-calico\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.175716 kubelet[1788]: I0213 16:18:04.174815 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-bin-dir\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.175716 kubelet[1788]: I0213 16:18:04.174866 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-log-dir\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.175716 kubelet[1788]: I0213 16:18:04.174907 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-flexvol-driver-host\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.175716 kubelet[1788]: I0213 16:18:04.174949 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b-kube-proxy\") pod \"kube-proxy-28l57\" 
(UID: \"0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b\") " pod="kube-system/kube-proxy-28l57" Feb 13 16:18:04.176056 kubelet[1788]: I0213 16:18:04.174981 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce202485-6d0f-47ce-8917-54c286d3eb4b-tigera-ca-bundle\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176056 kubelet[1788]: I0213 16:18:04.175014 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-run-calico\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176056 kubelet[1788]: I0213 16:18:04.175046 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/63a56fc8-68aa-4d63-8400-3078bd2ff61f-varrun\") pod \"csi-node-driver-jlrrr\" (UID: \"63a56fc8-68aa-4d63-8400-3078bd2ff61f\") " pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:04.176056 kubelet[1788]: I0213 16:18:04.175088 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63a56fc8-68aa-4d63-8400-3078bd2ff61f-socket-dir\") pod \"csi-node-driver-jlrrr\" (UID: \"63a56fc8-68aa-4d63-8400-3078bd2ff61f\") " pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:04.176056 kubelet[1788]: I0213 16:18:04.175123 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-net-dir\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " 
pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176279 kubelet[1788]: I0213 16:18:04.175179 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jglp2\" (UniqueName: \"kubernetes.io/projected/ce202485-6d0f-47ce-8917-54c286d3eb4b-kube-api-access-jglp2\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176279 kubelet[1788]: I0213 16:18:04.175263 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ce202485-6d0f-47ce-8917-54c286d3eb4b-node-certs\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176279 kubelet[1788]: I0213 16:18:04.175358 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63a56fc8-68aa-4d63-8400-3078bd2ff61f-registration-dir\") pod \"csi-node-driver-jlrrr\" (UID: \"63a56fc8-68aa-4d63-8400-3078bd2ff61f\") " pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:04.176279 kubelet[1788]: I0213 16:18:04.175407 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b-xtables-lock\") pod \"kube-proxy-28l57\" (UID: \"0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b\") " pod="kube-system/kube-proxy-28l57" Feb 13 16:18:04.176279 kubelet[1788]: I0213 16:18:04.175443 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-lib-modules\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176480 
kubelet[1788]: I0213 16:18:04.175487 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-policysync\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176480 kubelet[1788]: I0213 16:18:04.175525 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5zf6\" (UniqueName: \"kubernetes.io/projected/63a56fc8-68aa-4d63-8400-3078bd2ff61f-kube-api-access-m5zf6\") pod \"csi-node-driver-jlrrr\" (UID: \"63a56fc8-68aa-4d63-8400-3078bd2ff61f\") " pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:04.176480 kubelet[1788]: I0213 16:18:04.175567 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b-lib-modules\") pod \"kube-proxy-28l57\" (UID: \"0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b\") " pod="kube-system/kube-proxy-28l57" Feb 13 16:18:04.176480 kubelet[1788]: I0213 16:18:04.175602 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjldd\" (UniqueName: \"kubernetes.io/projected/0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b-kube-api-access-pjldd\") pod \"kube-proxy-28l57\" (UID: \"0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b\") " pod="kube-system/kube-proxy-28l57" Feb 13 16:18:04.176480 kubelet[1788]: I0213 16:18:04.175645 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-xtables-lock\") pod \"calico-node-hpnmv\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") " pod="calico-system/calico-node-hpnmv" Feb 13 16:18:04.176685 kubelet[1788]: I0213 16:18:04.175681 1788 
reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63a56fc8-68aa-4d63-8400-3078bd2ff61f-kubelet-dir\") pod \"csi-node-driver-jlrrr\" (UID: \"63a56fc8-68aa-4d63-8400-3078bd2ff61f\") " pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:04.176789 systemd[1]: Created slice kubepods-besteffort-podce202485_6d0f_47ce_8917_54c286d3eb4b.slice - libcontainer container kubepods-besteffort-podce202485_6d0f_47ce_8917_54c286d3eb4b.slice. Feb 13 16:18:04.281319 kubelet[1788]: E0213 16:18:04.281276 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.281686 kubelet[1788]: W0213 16:18:04.281490 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.281686 kubelet[1788]: E0213 16:18:04.281568 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.281970 kubelet[1788]: E0213 16:18:04.281914 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.281970 kubelet[1788]: W0213 16:18:04.281932 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.282120 kubelet[1788]: E0213 16:18:04.282018 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:04.282120 kubelet[1788]: E0213 16:18:04.282117 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.282210 kubelet[1788]: W0213 16:18:04.282124 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.282210 kubelet[1788]: E0213 16:18:04.282144 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.282351 kubelet[1788]: E0213 16:18:04.282327 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.282351 kubelet[1788]: W0213 16:18:04.282339 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.282351 kubelet[1788]: E0213 16:18:04.282351 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:04.282750 kubelet[1788]: E0213 16:18:04.282728 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.282750 kubelet[1788]: W0213 16:18:04.282743 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.282974 kubelet[1788]: E0213 16:18:04.282769 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.283204 kubelet[1788]: E0213 16:18:04.283186 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.283515 kubelet[1788]: W0213 16:18:04.283366 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.283515 kubelet[1788]: E0213 16:18:04.283412 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:04.283845 kubelet[1788]: E0213 16:18:04.283830 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.283921 kubelet[1788]: W0213 16:18:04.283908 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.284157 kubelet[1788]: E0213 16:18:04.283992 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.284444 kubelet[1788]: E0213 16:18:04.284422 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.284444 kubelet[1788]: W0213 16:18:04.284441 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.284555 kubelet[1788]: E0213 16:18:04.284473 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:04.295533 kubelet[1788]: E0213 16:18:04.294322 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.296217 kubelet[1788]: W0213 16:18:04.295920 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.296217 kubelet[1788]: E0213 16:18:04.296004 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.296556 kubelet[1788]: E0213 16:18:04.296454 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.296556 kubelet[1788]: W0213 16:18:04.296470 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.296556 kubelet[1788]: E0213 16:18:04.296504 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:04.315713 kubelet[1788]: E0213 16:18:04.315669 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.315713 kubelet[1788]: W0213 16:18:04.315704 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.315958 kubelet[1788]: E0213 16:18:04.315753 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.321088 kubelet[1788]: E0213 16:18:04.316906 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.321088 kubelet[1788]: W0213 16:18:04.316941 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.321088 kubelet[1788]: E0213 16:18:04.316974 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:04.335634 kubelet[1788]: E0213 16:18:04.335590 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:04.335634 kubelet[1788]: W0213 16:18:04.335626 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:04.335861 kubelet[1788]: E0213 16:18:04.335667 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:04.469943 kubelet[1788]: E0213 16:18:04.468765 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:04.471785 containerd[1474]: time="2025-02-13T16:18:04.471721047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-28l57,Uid:0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b,Namespace:kube-system,Attempt:0,}" Feb 13 16:18:04.483077 kubelet[1788]: E0213 16:18:04.482636 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:04.484037 containerd[1474]: time="2025-02-13T16:18:04.483948792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hpnmv,Uid:ce202485-6d0f-47ce-8917-54c286d3eb4b,Namespace:calico-system,Attempt:0,}" Feb 13 16:18:05.113851 kubelet[1788]: E0213 16:18:05.113781 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:05.239557 containerd[1474]: time="2025-02-13T16:18:05.237416021Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:18:05.239690 containerd[1474]: time="2025-02-13T16:18:05.239549713Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:18:05.241831 containerd[1474]: time="2025-02-13T16:18:05.241759863Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 16:18:05.241978 containerd[1474]: time="2025-02-13T16:18:05.241815454Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Feb 13 16:18:05.242983 containerd[1474]: time="2025-02-13T16:18:05.242914970Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:18:05.248420 containerd[1474]: time="2025-02-13T16:18:05.248202349Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:18:05.251393 containerd[1474]: time="2025-02-13T16:18:05.250284507Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 766.096874ms" Feb 13 16:18:05.253616 containerd[1474]: time="2025-02-13T16:18:05.253523915Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id 
\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 781.609278ms" Feb 13 16:18:05.291163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3912177058.mount: Deactivated successfully. Feb 13 16:18:05.445356 containerd[1474]: time="2025-02-13T16:18:05.445040854Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:05.445674 containerd[1474]: time="2025-02-13T16:18:05.445219065Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:05.446136 containerd[1474]: time="2025-02-13T16:18:05.445811559Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:05.447869 containerd[1474]: time="2025-02-13T16:18:05.447738631Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:05.448158 containerd[1474]: time="2025-02-13T16:18:05.447834918Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:05.448158 containerd[1474]: time="2025-02-13T16:18:05.447878164Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:05.448158 containerd[1474]: time="2025-02-13T16:18:05.447990372Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:05.450123 containerd[1474]: time="2025-02-13T16:18:05.447037534Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:05.637297 systemd[1]: Started cri-containerd-c76c4d0752e89b64174f3ea870867722052a2dce15c721a05a771ee9de60be1d.scope - libcontainer container c76c4d0752e89b64174f3ea870867722052a2dce15c721a05a771ee9de60be1d. Feb 13 16:18:05.649616 systemd[1]: Started cri-containerd-396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8.scope - libcontainer container 396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8. Feb 13 16:18:05.708039 containerd[1474]: time="2025-02-13T16:18:05.707547396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-28l57,Uid:0bcc904a-cb83-4ab1-99a2-5c1a7f773d2b,Namespace:kube-system,Attempt:0,} returns sandbox id \"c76c4d0752e89b64174f3ea870867722052a2dce15c721a05a771ee9de60be1d\"" Feb 13 16:18:05.714245 kubelet[1788]: E0213 16:18:05.714198 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:05.716697 containerd[1474]: time="2025-02-13T16:18:05.716644735Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.14\"" Feb 13 16:18:05.724696 containerd[1474]: time="2025-02-13T16:18:05.724614844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hpnmv,Uid:ce202485-6d0f-47ce-8917-54c286d3eb4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\"" Feb 13 16:18:05.728025 kubelet[1788]: E0213 16:18:05.727963 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:06.115265 kubelet[1788]: E0213 16:18:06.115009 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:06.354268 
kubelet[1788]: E0213 16:18:06.352695 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:07.116087 kubelet[1788]: E0213 16:18:07.116024 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:07.268292 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1208651812.mount: Deactivated successfully. Feb 13 16:18:08.117424 kubelet[1788]: E0213 16:18:08.117243 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:08.351797 kubelet[1788]: E0213 16:18:08.351738 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:08.549284 containerd[1474]: time="2025-02-13T16:18:08.549181116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.14\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:08.550896 containerd[1474]: time="2025-02-13T16:18:08.550793024Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.14: active requests=0, bytes read=28620592" Feb 13 16:18:08.552067 containerd[1474]: time="2025-02-13T16:18:08.551853895Z" level=info msg="ImageCreate event name:\"sha256:609f2866f1e52a5f0d2651e1206db6aeb38e8c3f91175abcfaf7e87381e5cce2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:08.555184 containerd[1474]: time="2025-02-13T16:18:08.555101775Z" level=info msg="ImageCreate 
event name:\"registry.k8s.io/kube-proxy@sha256:197988595a902751e4e570a5e4d74182f12d83c1d175c1e79aa020f358f6535b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:08.556476 containerd[1474]: time="2025-02-13T16:18:08.556414666Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.14\" with image id \"sha256:609f2866f1e52a5f0d2651e1206db6aeb38e8c3f91175abcfaf7e87381e5cce2\", repo tag \"registry.k8s.io/kube-proxy:v1.29.14\", repo digest \"registry.k8s.io/kube-proxy@sha256:197988595a902751e4e570a5e4d74182f12d83c1d175c1e79aa020f358f6535b\", size \"28619611\" in 2.839347253s" Feb 13 16:18:08.556476 containerd[1474]: time="2025-02-13T16:18:08.556481578Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.14\" returns image reference \"sha256:609f2866f1e52a5f0d2651e1206db6aeb38e8c3f91175abcfaf7e87381e5cce2\"" Feb 13 16:18:08.558632 containerd[1474]: time="2025-02-13T16:18:08.558585653Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 16:18:08.560881 containerd[1474]: time="2025-02-13T16:18:08.560831242Z" level=info msg="CreateContainer within sandbox \"c76c4d0752e89b64174f3ea870867722052a2dce15c721a05a771ee9de60be1d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 16:18:08.589825 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1082670206.mount: Deactivated successfully. 
Feb 13 16:18:08.594433 containerd[1474]: time="2025-02-13T16:18:08.594355391Z" level=info msg="CreateContainer within sandbox \"c76c4d0752e89b64174f3ea870867722052a2dce15c721a05a771ee9de60be1d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5914d483b8018484e7e611361a69d80b964ed76fa94947f1148f077436560fb3\"" Feb 13 16:18:08.595900 containerd[1474]: time="2025-02-13T16:18:08.595840730Z" level=info msg="StartContainer for \"5914d483b8018484e7e611361a69d80b964ed76fa94947f1148f077436560fb3\"" Feb 13 16:18:08.669735 systemd[1]: run-containerd-runc-k8s.io-5914d483b8018484e7e611361a69d80b964ed76fa94947f1148f077436560fb3-runc.7XI76t.mount: Deactivated successfully. Feb 13 16:18:08.681387 systemd[1]: Started cri-containerd-5914d483b8018484e7e611361a69d80b964ed76fa94947f1148f077436560fb3.scope - libcontainer container 5914d483b8018484e7e611361a69d80b964ed76fa94947f1148f077436560fb3. Feb 13 16:18:08.776695 containerd[1474]: time="2025-02-13T16:18:08.776616757Z" level=info msg="StartContainer for \"5914d483b8018484e7e611361a69d80b964ed76fa94947f1148f077436560fb3\" returns successfully" Feb 13 16:18:09.118017 kubelet[1788]: E0213 16:18:09.117804 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:09.426967 kubelet[1788]: E0213 16:18:09.425532 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.526944 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.532642 kubelet[1788]: W0213 16:18:09.526980 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 
13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.527022 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.527445 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.532642 kubelet[1788]: W0213 16:18:09.527478 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.527505 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.527800 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.532642 kubelet[1788]: W0213 16:18:09.527810 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.527846 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.532642 kubelet[1788]: E0213 16:18:09.528098 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533189 kubelet[1788]: W0213 16:18:09.528110 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533189 kubelet[1788]: E0213 16:18:09.528148 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.533189 kubelet[1788]: E0213 16:18:09.528436 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533189 kubelet[1788]: W0213 16:18:09.528464 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533189 kubelet[1788]: E0213 16:18:09.528481 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.533189 kubelet[1788]: E0213 16:18:09.528703 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533189 kubelet[1788]: W0213 16:18:09.528713 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533189 kubelet[1788]: E0213 16:18:09.528732 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.533189 kubelet[1788]: E0213 16:18:09.528954 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533189 kubelet[1788]: W0213 16:18:09.528964 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.528997 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.529265 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533696 kubelet[1788]: W0213 16:18:09.529276 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.529295 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.529585 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533696 kubelet[1788]: W0213 16:18:09.529597 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.529626 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.529851 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.533696 kubelet[1788]: W0213 16:18:09.529862 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.533696 kubelet[1788]: E0213 16:18:09.529898 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530128 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534107 kubelet[1788]: W0213 16:18:09.530138 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530153 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530386 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534107 kubelet[1788]: W0213 16:18:09.530396 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530412 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530658 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534107 kubelet[1788]: W0213 16:18:09.530668 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530685 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.534107 kubelet[1788]: E0213 16:18:09.530903 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534815 kubelet[1788]: W0213 16:18:09.530914 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.534815 kubelet[1788]: E0213 16:18:09.530930 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.534815 kubelet[1788]: E0213 16:18:09.531151 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534815 kubelet[1788]: W0213 16:18:09.531164 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.534815 kubelet[1788]: E0213 16:18:09.531183 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.534815 kubelet[1788]: E0213 16:18:09.531468 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534815 kubelet[1788]: W0213 16:18:09.531480 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.534815 kubelet[1788]: E0213 16:18:09.531498 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.534815 kubelet[1788]: E0213 16:18:09.531708 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.534815 kubelet[1788]: W0213 16:18:09.531718 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.531735 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.531963 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.535188 kubelet[1788]: W0213 16:18:09.531974 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.531991 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.532214 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.535188 kubelet[1788]: W0213 16:18:09.532256 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.532277 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.532474 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.535188 kubelet[1788]: W0213 16:18:09.532484 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.535188 kubelet[1788]: E0213 16:18:09.532499 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.548629 kubelet[1788]: E0213 16:18:09.548587 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.550689 kubelet[1788]: W0213 16:18:09.548837 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.550689 kubelet[1788]: E0213 16:18:09.548880 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.550689 kubelet[1788]: E0213 16:18:09.549376 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.550689 kubelet[1788]: W0213 16:18:09.549393 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.550689 kubelet[1788]: E0213 16:18:09.549419 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.550689 kubelet[1788]: E0213 16:18:09.549624 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.550689 kubelet[1788]: W0213 16:18:09.549635 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.550689 kubelet[1788]: E0213 16:18:09.549651 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.550689 kubelet[1788]: E0213 16:18:09.549815 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.550689 kubelet[1788]: W0213 16:18:09.549824 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.551166 kubelet[1788]: E0213 16:18:09.549839 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.551166 kubelet[1788]: E0213 16:18:09.550065 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.551166 kubelet[1788]: W0213 16:18:09.550074 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.551166 kubelet[1788]: E0213 16:18:09.550089 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.551166 kubelet[1788]: E0213 16:18:09.550580 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.551166 kubelet[1788]: W0213 16:18:09.550593 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.551166 kubelet[1788]: E0213 16:18:09.550610 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552152 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.553819 kubelet[1788]: W0213 16:18:09.552177 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552199 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552510 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.553819 kubelet[1788]: W0213 16:18:09.552521 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552537 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552784 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.553819 kubelet[1788]: W0213 16:18:09.552794 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552816 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.553819 kubelet[1788]: E0213 16:18:09.552985 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.554300 kubelet[1788]: W0213 16:18:09.552995 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.554300 kubelet[1788]: E0213 16:18:09.553015 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:09.554300 kubelet[1788]: E0213 16:18:09.553221 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.554300 kubelet[1788]: W0213 16:18:09.553245 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.554300 kubelet[1788]: E0213 16:18:09.553261 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:09.554300 kubelet[1788]: E0213 16:18:09.553664 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:09.554300 kubelet[1788]: W0213 16:18:09.553674 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:09.554300 kubelet[1788]: E0213 16:18:09.553691 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.118415 kubelet[1788]: E0213 16:18:10.118210 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:10.147125 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3286022738.mount: Deactivated successfully. 
Feb 13 16:18:10.353466 kubelet[1788]: E0213 16:18:10.353386 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:10.402831 containerd[1474]: time="2025-02-13T16:18:10.401796461Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:10.402831 containerd[1474]: time="2025-02-13T16:18:10.402273346Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=6855343" Feb 13 16:18:10.405493 containerd[1474]: time="2025-02-13T16:18:10.405431282Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:10.411758 containerd[1474]: time="2025-02-13T16:18:10.411615468Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:10.413253 containerd[1474]: time="2025-02-13T16:18:10.412736610Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.853541586s" Feb 13 16:18:10.413253 containerd[1474]: time="2025-02-13T16:18:10.412814022Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 16:18:10.416903 containerd[1474]: time="2025-02-13T16:18:10.416715688Z" level=info msg="CreateContainer within sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 16:18:10.433678 kubelet[1788]: E0213 16:18:10.433031 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:10.447736 kubelet[1788]: E0213 16:18:10.447669 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.448162 kubelet[1788]: W0213 16:18:10.447931 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.448162 kubelet[1788]: E0213 16:18:10.448000 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.448731 kubelet[1788]: E0213 16:18:10.448611 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.448731 kubelet[1788]: W0213 16:18:10.448633 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.448731 kubelet[1788]: E0213 16:18:10.448669 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.450331 kubelet[1788]: E0213 16:18:10.449559 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.450331 kubelet[1788]: W0213 16:18:10.449580 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.450331 kubelet[1788]: E0213 16:18:10.449616 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.450732 containerd[1474]: time="2025-02-13T16:18:10.450077953Z" level=info msg="CreateContainer within sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6\"" Feb 13 16:18:10.451149 containerd[1474]: time="2025-02-13T16:18:10.451117456Z" level=info msg="StartContainer for \"e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6\"" Feb 13 16:18:10.452377 kubelet[1788]: E0213 16:18:10.451285 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.452377 kubelet[1788]: W0213 16:18:10.451316 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.452377 kubelet[1788]: E0213 16:18:10.451342 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.452844 kubelet[1788]: E0213 16:18:10.452819 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.452844 kubelet[1788]: W0213 16:18:10.452842 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.452931 kubelet[1788]: E0213 16:18:10.452863 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.454732 kubelet[1788]: E0213 16:18:10.453410 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.454732 kubelet[1788]: W0213 16:18:10.453421 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.454732 kubelet[1788]: E0213 16:18:10.453435 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.457678 kubelet[1788]: E0213 16:18:10.455642 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.457678 kubelet[1788]: W0213 16:18:10.455665 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.457678 kubelet[1788]: E0213 16:18:10.455690 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.458324 kubelet[1788]: E0213 16:18:10.458292 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.458324 kubelet[1788]: W0213 16:18:10.458322 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.459012 kubelet[1788]: E0213 16:18:10.458349 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.459012 kubelet[1788]: E0213 16:18:10.458657 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.459012 kubelet[1788]: W0213 16:18:10.458671 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.459012 kubelet[1788]: E0213 16:18:10.458687 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.459012 kubelet[1788]: E0213 16:18:10.458925 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.459012 kubelet[1788]: W0213 16:18:10.458937 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.459012 kubelet[1788]: E0213 16:18:10.458957 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459165 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460070 kubelet[1788]: W0213 16:18:10.459181 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459198 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459446 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460070 kubelet[1788]: W0213 16:18:10.459459 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459478 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459698 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460070 kubelet[1788]: W0213 16:18:10.459710 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459727 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.460070 kubelet[1788]: E0213 16:18:10.459937 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460897 kubelet[1788]: W0213 16:18:10.459948 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.460897 kubelet[1788]: E0213 16:18:10.459964 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.460897 kubelet[1788]: E0213 16:18:10.460196 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460897 kubelet[1788]: W0213 16:18:10.460209 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.460897 kubelet[1788]: E0213 16:18:10.460251 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.460897 kubelet[1788]: E0213 16:18:10.460503 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460897 kubelet[1788]: W0213 16:18:10.460514 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.460897 kubelet[1788]: E0213 16:18:10.460531 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.460897 kubelet[1788]: E0213 16:18:10.460747 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.460897 kubelet[1788]: W0213 16:18:10.460759 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.460779 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.460981 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.461664 kubelet[1788]: W0213 16:18:10.460992 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.461010 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.461366 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.461664 kubelet[1788]: W0213 16:18:10.461381 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.461404 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.461621 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.461664 kubelet[1788]: W0213 16:18:10.461632 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.461664 kubelet[1788]: E0213 16:18:10.461650 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.462615 kubelet[1788]: E0213 16:18:10.462048 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.462615 kubelet[1788]: W0213 16:18:10.462061 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.462615 kubelet[1788]: E0213 16:18:10.462082 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.462615 kubelet[1788]: E0213 16:18:10.462461 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.462615 kubelet[1788]: W0213 16:18:10.462474 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.462615 kubelet[1788]: E0213 16:18:10.462512 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.464350 kubelet[1788]: E0213 16:18:10.463846 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.464350 kubelet[1788]: W0213 16:18:10.463874 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.464350 kubelet[1788]: E0213 16:18:10.463907 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.465853 kubelet[1788]: E0213 16:18:10.464612 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.465853 kubelet[1788]: W0213 16:18:10.464627 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.465853 kubelet[1788]: E0213 16:18:10.464679 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.466034 kubelet[1788]: E0213 16:18:10.465870 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.466034 kubelet[1788]: W0213 16:18:10.465892 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.466034 kubelet[1788]: E0213 16:18:10.465923 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.468782 kubelet[1788]: E0213 16:18:10.468743 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.468782 kubelet[1788]: W0213 16:18:10.468779 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.469331 kubelet[1788]: E0213 16:18:10.468982 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.469331 kubelet[1788]: E0213 16:18:10.469269 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.469331 kubelet[1788]: W0213 16:18:10.469286 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.469331 kubelet[1788]: E0213 16:18:10.469317 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.470356 kubelet[1788]: E0213 16:18:10.469768 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.470356 kubelet[1788]: W0213 16:18:10.469785 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.470356 kubelet[1788]: E0213 16:18:10.469816 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.470356 kubelet[1788]: E0213 16:18:10.470201 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.470356 kubelet[1788]: W0213 16:18:10.470217 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.470356 kubelet[1788]: E0213 16:18:10.470255 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.473513 kubelet[1788]: E0213 16:18:10.473072 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.473513 kubelet[1788]: W0213 16:18:10.473112 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.473513 kubelet[1788]: E0213 16:18:10.473171 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:18:10.475610 kubelet[1788]: E0213 16:18:10.474832 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.475610 kubelet[1788]: W0213 16:18:10.474870 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.475610 kubelet[1788]: E0213 16:18:10.474913 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.476772 kubelet[1788]: E0213 16:18:10.476641 1788 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:18:10.476772 kubelet[1788]: W0213 16:18:10.476669 1788 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:18:10.476772 kubelet[1788]: E0213 16:18:10.476696 1788 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:18:10.530459 systemd[1]: Started cri-containerd-e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6.scope - libcontainer container e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6. Feb 13 16:18:10.598455 containerd[1474]: time="2025-02-13T16:18:10.597695324Z" level=info msg="StartContainer for \"e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6\" returns successfully" Feb 13 16:18:10.623405 systemd[1]: cri-containerd-e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6.scope: Deactivated successfully. 
Feb 13 16:18:10.638785 systemd-resolved[1327]: Using degraded feature set UDP instead of UDP+EDNS0 for DNS server 67.207.67.3. Feb 13 16:18:10.771089 containerd[1474]: time="2025-02-13T16:18:10.770744320Z" level=info msg="shim disconnected" id=e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6 namespace=k8s.io Feb 13 16:18:10.771089 containerd[1474]: time="2025-02-13T16:18:10.770839852Z" level=warning msg="cleaning up after shim disconnected" id=e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6 namespace=k8s.io Feb 13 16:18:10.771089 containerd[1474]: time="2025-02-13T16:18:10.770852354Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 16:18:10.822345 containerd[1474]: time="2025-02-13T16:18:10.821944055Z" level=warning msg="cleanup warnings time=\"2025-02-13T16:18:10Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Feb 13 16:18:11.022104 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6-rootfs.mount: Deactivated successfully. 
Feb 13 16:18:11.119178 kubelet[1788]: E0213 16:18:11.119097 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:11.448050 kubelet[1788]: E0213 16:18:11.447985 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:11.449832 containerd[1474]: time="2025-02-13T16:18:11.449560401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 16:18:11.485139 kubelet[1788]: I0213 16:18:11.484984 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-28l57" podStartSLOduration=6.642738702 podStartE2EDuration="9.484729928s" podCreationTimestamp="2025-02-13 16:18:02 +0000 UTC" firstStartedPulling="2025-02-13 16:18:05.715678786 +0000 UTC m=+4.410656659" lastFinishedPulling="2025-02-13 16:18:08.557669998 +0000 UTC m=+7.252647885" observedRunningTime="2025-02-13 16:18:09.445623655 +0000 UTC m=+8.140601551" watchObservedRunningTime="2025-02-13 16:18:11.484729928 +0000 UTC m=+10.179707815" Feb 13 16:18:12.119896 kubelet[1788]: E0213 16:18:12.119836 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:12.353144 kubelet[1788]: E0213 16:18:12.352617 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:13.120143 kubelet[1788]: E0213 16:18:13.120038 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:13.884677 systemd[1]: Started 
sshd@8-64.227.101.255:22-160.22.195.6:55368.service - OpenSSH per-connection server daemon (160.22.195.6:55368). Feb 13 16:18:14.121189 kubelet[1788]: E0213 16:18:14.121037 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:14.354791 kubelet[1788]: E0213 16:18:14.352609 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:15.073598 sshd[2234]: Invalid user bitwarden from 160.22.195.6 port 55368 Feb 13 16:18:15.121285 kubelet[1788]: E0213 16:18:15.121158 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:15.290495 sshd[2234]: Received disconnect from 160.22.195.6 port 55368:11: Bye Bye [preauth] Feb 13 16:18:15.290495 sshd[2234]: Disconnected from invalid user bitwarden 160.22.195.6 port 55368 [preauth] Feb 13 16:18:15.295219 systemd[1]: sshd@8-64.227.101.255:22-160.22.195.6:55368.service: Deactivated successfully. 
Feb 13 16:18:16.122011 kubelet[1788]: E0213 16:18:16.121956 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:16.352713 kubelet[1788]: E0213 16:18:16.351819 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:17.122694 kubelet[1788]: E0213 16:18:17.122640 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:17.265680 containerd[1474]: time="2025-02-13T16:18:17.265572596Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:17.266963 containerd[1474]: time="2025-02-13T16:18:17.266872421Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 16:18:17.268476 containerd[1474]: time="2025-02-13T16:18:17.268138225Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:17.271695 containerd[1474]: time="2025-02-13T16:18:17.271640823Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:17.273910 containerd[1474]: time="2025-02-13T16:18:17.273829965Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 5.824193043s" Feb 13 16:18:17.273910 containerd[1474]: time="2025-02-13T16:18:17.273895377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 16:18:17.277604 containerd[1474]: time="2025-02-13T16:18:17.277556506Z" level=info msg="CreateContainer within sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 16:18:17.334079 containerd[1474]: time="2025-02-13T16:18:17.333999188Z" level=info msg="CreateContainer within sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc\"" Feb 13 16:18:17.336291 containerd[1474]: time="2025-02-13T16:18:17.334861353Z" level=info msg="StartContainer for \"d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc\"" Feb 13 16:18:17.411675 systemd[1]: Started cri-containerd-d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc.scope - libcontainer container d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc. Feb 13 16:18:17.479975 containerd[1474]: time="2025-02-13T16:18:17.478757693Z" level=info msg="StartContainer for \"d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc\" returns successfully" Feb 13 16:18:17.944053 systemd-resolved[1327]: Using degraded feature set TCP instead of UDP for DNS server 67.207.67.3. Feb 13 16:18:18.017687 systemd-timesyncd[1345]: Contacted time server 75.72.171.171:123 (2.flatcar.pool.ntp.org). Feb 13 16:18:18.018242 systemd-timesyncd[1345]: Initial clock synchronization to Thu 2025-02-13 16:18:17.938358 UTC. 
Feb 13 16:18:18.123968 kubelet[1788]: E0213 16:18:18.123782 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:18.355386 kubelet[1788]: E0213 16:18:18.352661 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f"
Feb 13 16:18:18.392896 systemd[1]: cri-containerd-d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc.scope: Deactivated successfully.
Feb 13 16:18:18.444494 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc-rootfs.mount: Deactivated successfully.
Feb 13 16:18:18.486081 kubelet[1788]: E0213 16:18:18.485860 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:18.503483 kubelet[1788]: I0213 16:18:18.501474 1788 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Feb 13 16:18:18.597266 containerd[1474]: time="2025-02-13T16:18:18.595746355Z" level=info msg="shim disconnected" id=d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc namespace=k8s.io
Feb 13 16:18:18.597266 containerd[1474]: time="2025-02-13T16:18:18.595835167Z" level=warning msg="cleaning up after shim disconnected" id=d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc namespace=k8s.io
Feb 13 16:18:18.597266 containerd[1474]: time="2025-02-13T16:18:18.595850996Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Feb 13 16:18:19.124475 kubelet[1788]: E0213 16:18:19.124388 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:19.483669 kubelet[1788]: E0213 16:18:19.483339 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:19.484927 containerd[1474]: time="2025-02-13T16:18:19.484872936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\""
Feb 13 16:18:20.022123 systemd[1]: Started sshd@9-64.227.101.255:22-119.202.128.28:47606.service - OpenSSH per-connection server daemon (119.202.128.28:47606).
Feb 13 16:18:20.125068 kubelet[1788]: E0213 16:18:20.124924 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:20.361366 systemd[1]: Created slice kubepods-besteffort-pod63a56fc8_68aa_4d63_8400_3078bd2ff61f.slice - libcontainer container kubepods-besteffort-pod63a56fc8_68aa_4d63_8400_3078bd2ff61f.slice.
Feb 13 16:18:20.366334 containerd[1474]: time="2025-02-13T16:18:20.366248839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:0,}"
Feb 13 16:18:20.607360 containerd[1474]: time="2025-02-13T16:18:20.606979140Z" level=error msg="Failed to destroy network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:20.609955 containerd[1474]: time="2025-02-13T16:18:20.609642345Z" level=error msg="encountered an error cleaning up failed sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:20.609955 containerd[1474]: time="2025-02-13T16:18:20.609772839Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:20.610619 kubelet[1788]: E0213 16:18:20.610549 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:20.610783 kubelet[1788]: E0213 16:18:20.610761 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:20.610869 kubelet[1788]: E0213 16:18:20.610852 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:20.610979 kubelet[1788]: E0213 16:18:20.610956 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f"
Feb 13 16:18:20.612499 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b-shm.mount: Deactivated successfully.
Feb 13 16:18:20.781103 sshd[2303]: Invalid user bot from 119.202.128.28 port 47606
Feb 13 16:18:20.920559 sshd[2303]: Received disconnect from 119.202.128.28 port 47606:11: Bye Bye [preauth]
Feb 13 16:18:20.920559 sshd[2303]: Disconnected from invalid user bot 119.202.128.28 port 47606 [preauth]
Feb 13 16:18:20.920364 systemd[1]: sshd@9-64.227.101.255:22-119.202.128.28:47606.service: Deactivated successfully.
Feb 13 16:18:21.126301 kubelet[1788]: E0213 16:18:21.125962 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:21.489872 kubelet[1788]: I0213 16:18:21.489812 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b"
Feb 13 16:18:21.492241 containerd[1474]: time="2025-02-13T16:18:21.492147334Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\""
Feb 13 16:18:21.492872 containerd[1474]: time="2025-02-13T16:18:21.492535477Z" level=info msg="Ensure that sandbox 89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b in task-service has been cleanup successfully"
Feb 13 16:18:21.501268 containerd[1474]: time="2025-02-13T16:18:21.499070201Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully"
Feb 13 16:18:21.501268 containerd[1474]: time="2025-02-13T16:18:21.499133454Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully"
Feb 13 16:18:21.501268 containerd[1474]: time="2025-02-13T16:18:21.500751722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:1,}"
Feb 13 16:18:21.512013 systemd[1]: run-netns-cni\x2d51e40d7f\x2dd89c\x2d5724\x2d6c85\x2d5d7fe8321920.mount: Deactivated successfully.
Feb 13 16:18:21.706497 containerd[1474]: time="2025-02-13T16:18:21.706432650Z" level=error msg="Failed to destroy network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:21.709191 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23-shm.mount: Deactivated successfully.
Feb 13 16:18:21.713302 containerd[1474]: time="2025-02-13T16:18:21.712214483Z" level=error msg="encountered an error cleaning up failed sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:21.713302 containerd[1474]: time="2025-02-13T16:18:21.712414901Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:21.713817 kubelet[1788]: E0213 16:18:21.712823 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:21.713817 kubelet[1788]: E0213 16:18:21.712905 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:21.713817 kubelet[1788]: E0213 16:18:21.712964 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:21.714013 kubelet[1788]: E0213 16:18:21.713098 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f"
Feb 13 16:18:21.886660 systemd[1]: Started sshd@10-64.227.101.255:22-121.142.87.218:60224.service - OpenSSH per-connection server daemon (121.142.87.218:60224).
Feb 13 16:18:21.911117 kubelet[1788]: I0213 16:18:21.910467 1788 topology_manager.go:215] "Topology Admit Handler" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" podNamespace="default" podName="nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:21.926472 systemd[1]: Created slice kubepods-besteffort-podf6f14251_baf9_4c3b_8963_ab140ec1e4a0.slice - libcontainer container kubepods-besteffort-podf6f14251_baf9_4c3b_8963_ab140ec1e4a0.slice.
Feb 13 16:18:22.009622 kubelet[1788]: I0213 16:18:22.008535 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9q82\" (UniqueName: \"kubernetes.io/projected/f6f14251-baf9-4c3b-8963-ab140ec1e4a0-kube-api-access-s9q82\") pod \"nginx-deployment-6d5f899847-t8l5b\" (UID: \"f6f14251-baf9-4c3b-8963-ab140ec1e4a0\") " pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:22.106854 kubelet[1788]: E0213 16:18:22.106763 1788 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:22.128272 kubelet[1788]: E0213 16:18:22.126845 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:22.244085 containerd[1474]: time="2025-02-13T16:18:22.243416511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:0,}"
Feb 13 16:18:22.490117 containerd[1474]: time="2025-02-13T16:18:22.490002511Z" level=error msg="Failed to destroy network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.491540 containerd[1474]: time="2025-02-13T16:18:22.490939100Z" level=error msg="encountered an error cleaning up failed sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.491540 containerd[1474]: time="2025-02-13T16:18:22.491052892Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.491934 kubelet[1788]: E0213 16:18:22.491471 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.491934 kubelet[1788]: E0213 16:18:22.491558 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:22.491934 kubelet[1788]: E0213 16:18:22.491591 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:22.492116 kubelet[1788]: E0213 16:18:22.491704 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0"
Feb 13 16:18:22.500400 kubelet[1788]: I0213 16:18:22.499331 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23"
Feb 13 16:18:22.502785 containerd[1474]: time="2025-02-13T16:18:22.502730573Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\""
Feb 13 16:18:22.504431 containerd[1474]: time="2025-02-13T16:18:22.503721245Z" level=info msg="Ensure that sandbox c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23 in task-service has been cleanup successfully"
Feb 13 16:18:22.505646 containerd[1474]: time="2025-02-13T16:18:22.504882288Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully"
Feb 13 16:18:22.505646 containerd[1474]: time="2025-02-13T16:18:22.504939626Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully"
Feb 13 16:18:22.507966 containerd[1474]: time="2025-02-13T16:18:22.506800714Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\""
Feb 13 16:18:22.508362 containerd[1474]: time="2025-02-13T16:18:22.508292936Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully"
Feb 13 16:18:22.510132 containerd[1474]: time="2025-02-13T16:18:22.508533876Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully"
Feb 13 16:18:22.510768 kubelet[1788]: I0213 16:18:22.510677 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5"
Feb 13 16:18:22.510898 systemd[1]: run-netns-cni\x2d3c83eb60\x2d298b\x2dc898\x2db01f\x2d2feb9e7a640e.mount: Deactivated successfully.
Feb 13 16:18:22.513612 containerd[1474]: time="2025-02-13T16:18:22.511988334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:2,}"
Feb 13 16:18:22.525154 containerd[1474]: time="2025-02-13T16:18:22.525093527Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\""
Feb 13 16:18:22.526273 containerd[1474]: time="2025-02-13T16:18:22.525590352Z" level=info msg="Ensure that sandbox 3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5 in task-service has been cleanup successfully"
Feb 13 16:18:22.526492 containerd[1474]: time="2025-02-13T16:18:22.526440194Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully"
Feb 13 16:18:22.526575 containerd[1474]: time="2025-02-13T16:18:22.526558063Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully"
Feb 13 16:18:22.530546 containerd[1474]: time="2025-02-13T16:18:22.530486488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:1,}"
Feb 13 16:18:22.532830 systemd[1]: run-netns-cni\x2d32f4c679\x2dafb7\x2d6361\x2de325\x2d5137cfaa654c.mount: Deactivated successfully.
Feb 13 16:18:22.661621 sshd[2370]: Invalid user backend from 121.142.87.218 port 60224
Feb 13 16:18:22.788195 containerd[1474]: time="2025-02-13T16:18:22.786525255Z" level=error msg="Failed to destroy network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.788195 containerd[1474]: time="2025-02-13T16:18:22.787114246Z" level=error msg="encountered an error cleaning up failed sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.789285 containerd[1474]: time="2025-02-13T16:18:22.788684506Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:1,} failed, error" error="failed to setup network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.789285 containerd[1474]: time="2025-02-13T16:18:22.789077692Z" level=error msg="Failed to destroy network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.789868 containerd[1474]: time="2025-02-13T16:18:22.789825079Z" level=error msg="encountered an error cleaning up failed sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.790037 containerd[1474]: time="2025-02-13T16:18:22.790005598Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.791454 kubelet[1788]: E0213 16:18:22.790464 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.791454 kubelet[1788]: E0213 16:18:22.790550 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:22.791454 kubelet[1788]: E0213 16:18:22.790592 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:22.791768 kubelet[1788]: E0213 16:18:22.790683 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f"
Feb 13 16:18:22.791768 kubelet[1788]: E0213 16:18:22.791059 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:22.791768 kubelet[1788]: E0213 16:18:22.791099 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:22.792016 kubelet[1788]: E0213 16:18:22.791125 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:22.792016 kubelet[1788]: E0213 16:18:22.791186 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0"
Feb 13 16:18:22.792308 sshd[2370]: Received disconnect from 121.142.87.218 port 60224:11: Bye Bye [preauth]
Feb 13 16:18:22.792432 sshd[2370]: Disconnected from invalid user backend 121.142.87.218 port 60224 [preauth]
Feb 13 16:18:22.796142 systemd[1]: sshd@10-64.227.101.255:22-121.142.87.218:60224.service: Deactivated successfully.
Feb 13 16:18:23.129701 kubelet[1788]: E0213 16:18:23.127990 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:23.502091 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624-shm.mount: Deactivated successfully.
Feb 13 16:18:23.518699 kubelet[1788]: I0213 16:18:23.518542 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf"
Feb 13 16:18:23.519779 containerd[1474]: time="2025-02-13T16:18:23.519733439Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\""
Feb 13 16:18:23.523266 containerd[1474]: time="2025-02-13T16:18:23.520568405Z" level=info msg="Ensure that sandbox 588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf in task-service has been cleanup successfully"
Feb 13 16:18:23.527167 containerd[1474]: time="2025-02-13T16:18:23.525383024Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully"
Feb 13 16:18:23.527167 containerd[1474]: time="2025-02-13T16:18:23.525573570Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully"
Feb 13 16:18:23.526284 systemd[1]: run-netns-cni\x2db1f26855\x2dbf84\x2d8f53\x2df12c\x2d40601d65fe72.mount: Deactivated successfully.
Feb 13 16:18:23.528605 containerd[1474]: time="2025-02-13T16:18:23.527476761Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\""
Feb 13 16:18:23.528605 containerd[1474]: time="2025-02-13T16:18:23.527642312Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully"
Feb 13 16:18:23.528605 containerd[1474]: time="2025-02-13T16:18:23.527659386Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully"
Feb 13 16:18:23.531699 containerd[1474]: time="2025-02-13T16:18:23.531646609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:2,}"
Feb 13 16:18:23.533004 kubelet[1788]: I0213 16:18:23.532852 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624"
Feb 13 16:18:23.534216 containerd[1474]: time="2025-02-13T16:18:23.533765179Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\""
Feb 13 16:18:23.539260 containerd[1474]: time="2025-02-13T16:18:23.536426340Z" level=info msg="Ensure that sandbox b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624 in task-service has been cleanup successfully"
Feb 13 16:18:23.539811 containerd[1474]: time="2025-02-13T16:18:23.539658407Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully"
Feb 13 16:18:23.539887 containerd[1474]: time="2025-02-13T16:18:23.539812032Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully"
Feb 13 16:18:23.541726 systemd[1]: run-netns-cni\x2d9ccfa160\x2d754f\x2d75d0\x2dbd4f\x2dc938d19d3902.mount: Deactivated successfully.
Feb 13 16:18:23.542239 containerd[1474]: time="2025-02-13T16:18:23.542028410Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\""
Feb 13 16:18:23.542239 containerd[1474]: time="2025-02-13T16:18:23.542165030Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully"
Feb 13 16:18:23.542239 containerd[1474]: time="2025-02-13T16:18:23.542184081Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully"
Feb 13 16:18:23.545292 containerd[1474]: time="2025-02-13T16:18:23.542798187Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\""
Feb 13 16:18:23.545292 containerd[1474]: time="2025-02-13T16:18:23.542988533Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully"
Feb 13 16:18:23.545292 containerd[1474]: time="2025-02-13T16:18:23.543008472Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully"
Feb 13 16:18:23.548658 containerd[1474]: time="2025-02-13T16:18:23.547998190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:3,}"
Feb 13 16:18:23.813881 containerd[1474]: time="2025-02-13T16:18:23.813100036Z" level=error msg="Failed to destroy network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.817014 containerd[1474]: time="2025-02-13T16:18:23.816326555Z" level=error msg="encountered an error cleaning up failed sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.817014 containerd[1474]: time="2025-02-13T16:18:23.816455088Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.817246 kubelet[1788]: E0213 16:18:23.816812 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.817246 kubelet[1788]: E0213 16:18:23.816890 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:23.817246 kubelet[1788]: E0213 16:18:23.816923 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr"
Feb 13 16:18:23.817672 kubelet[1788]: E0213 16:18:23.817017 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f"
Feb 13 16:18:23.831761 containerd[1474]: time="2025-02-13T16:18:23.831396106Z" level=error msg="Failed to destroy network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.832185 containerd[1474]: time="2025-02-13T16:18:23.832095260Z" level=error msg="encountered an error cleaning up failed sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.832511 containerd[1474]: time="2025-02-13T16:18:23.832293949Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.832806 kubelet[1788]: E0213 16:18:23.832760 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 16:18:23.832896 kubelet[1788]: E0213 16:18:23.832859 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:23.833031 kubelet[1788]: E0213 16:18:23.832915 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b"
Feb 13 16:18:23.833066 kubelet[1788]: E0213 16:18:23.833035 1788
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:24.130029 kubelet[1788]: E0213 16:18:24.128987 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:24.498861 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6-shm.mount: Deactivated successfully. 
Feb 13 16:18:24.545749 kubelet[1788]: I0213 16:18:24.545595 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c" Feb 13 16:18:24.547955 containerd[1474]: time="2025-02-13T16:18:24.547871796Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:24.548466 containerd[1474]: time="2025-02-13T16:18:24.548316127Z" level=info msg="Ensure that sandbox e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c in task-service has been cleanup successfully" Feb 13 16:18:24.554045 containerd[1474]: time="2025-02-13T16:18:24.549504813Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:24.554045 containerd[1474]: time="2025-02-13T16:18:24.552248388Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:24.554331 containerd[1474]: time="2025-02-13T16:18:24.554301253Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:24.554671 containerd[1474]: time="2025-02-13T16:18:24.554451964Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:24.554671 containerd[1474]: time="2025-02-13T16:18:24.554477530Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:24.555045 systemd[1]: run-netns-cni\x2dd2c64e42\x2d2826\x2d2bbc\x2d6400\x2d66f09e77a352.mount: Deactivated successfully. 
Feb 13 16:18:24.556916 containerd[1474]: time="2025-02-13T16:18:24.556106253Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:24.556916 containerd[1474]: time="2025-02-13T16:18:24.556276970Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:24.556916 containerd[1474]: time="2025-02-13T16:18:24.556774179Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:24.559438 containerd[1474]: time="2025-02-13T16:18:24.558951886Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:24.559575 kubelet[1788]: I0213 16:18:24.559182 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6" Feb 13 16:18:24.559636 containerd[1474]: time="2025-02-13T16:18:24.559558960Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:24.559636 containerd[1474]: time="2025-02-13T16:18:24.559585602Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:24.561633 containerd[1474]: time="2025-02-13T16:18:24.561525478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:4,}" Feb 13 16:18:24.567850 containerd[1474]: time="2025-02-13T16:18:24.567272826Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:24.568367 containerd[1474]: time="2025-02-13T16:18:24.568173331Z" level=info msg="Ensure that sandbox 
b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6 in task-service has been cleanup successfully" Feb 13 16:18:24.573926 containerd[1474]: time="2025-02-13T16:18:24.573664719Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:24.574290 systemd[1]: run-netns-cni\x2d6d345093\x2d8deb\x2d382a\x2d7fed\x2d1194dfcc78fc.mount: Deactivated successfully. Feb 13 16:18:24.576727 containerd[1474]: time="2025-02-13T16:18:24.575384644Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:24.576727 containerd[1474]: time="2025-02-13T16:18:24.576056294Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:24.576727 containerd[1474]: time="2025-02-13T16:18:24.576197711Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:24.577305 containerd[1474]: time="2025-02-13T16:18:24.576215064Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:24.578360 containerd[1474]: time="2025-02-13T16:18:24.578273898Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:24.578516 containerd[1474]: time="2025-02-13T16:18:24.578475765Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:24.578516 containerd[1474]: time="2025-02-13T16:18:24.578500985Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:24.580439 containerd[1474]: time="2025-02-13T16:18:24.580158649Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:3,}" Feb 13 16:18:24.809117 containerd[1474]: time="2025-02-13T16:18:24.808055005Z" level=error msg="Failed to destroy network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.809951 containerd[1474]: time="2025-02-13T16:18:24.809704743Z" level=error msg="encountered an error cleaning up failed sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.809951 containerd[1474]: time="2025-02-13T16:18:24.809819294Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.810547 kubelet[1788]: E0213 16:18:24.810500 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.811179 kubelet[1788]: 
E0213 16:18:24.810864 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:24.811179 kubelet[1788]: E0213 16:18:24.810962 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:24.811179 kubelet[1788]: E0213 16:18:24.811072 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:24.841685 containerd[1474]: time="2025-02-13T16:18:24.841544681Z" level=error msg="Failed to destroy network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.843551 containerd[1474]: time="2025-02-13T16:18:24.843478525Z" level=error msg="encountered an error cleaning up failed sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.843980 containerd[1474]: time="2025-02-13T16:18:24.843941646Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:3,} failed, error" error="failed to setup network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.845144 kubelet[1788]: E0213 16:18:24.845090 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:24.845341 kubelet[1788]: E0213 16:18:24.845187 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:24.845341 kubelet[1788]: E0213 16:18:24.845247 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:24.845451 kubelet[1788]: E0213 16:18:24.845364 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:25.129949 kubelet[1788]: E0213 16:18:25.129801 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:25.498060 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36-shm.mount: Deactivated successfully. 
Feb 13 16:18:25.569262 kubelet[1788]: I0213 16:18:25.569087 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36" Feb 13 16:18:25.570645 containerd[1474]: time="2025-02-13T16:18:25.570180565Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:25.571196 containerd[1474]: time="2025-02-13T16:18:25.570683970Z" level=info msg="Ensure that sandbox be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36 in task-service has been cleanup successfully" Feb 13 16:18:25.574385 containerd[1474]: time="2025-02-13T16:18:25.573867147Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:25.574385 containerd[1474]: time="2025-02-13T16:18:25.573898422Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:25.575043 systemd[1]: run-netns-cni\x2d367e0add\x2def1c\x2d47c0\x2d090c\x2d2bddc2664010.mount: Deactivated successfully. 
Feb 13 16:18:25.575723 containerd[1474]: time="2025-02-13T16:18:25.575282731Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:25.575723 containerd[1474]: time="2025-02-13T16:18:25.575438963Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:25.575723 containerd[1474]: time="2025-02-13T16:18:25.575451513Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:25.578445 containerd[1474]: time="2025-02-13T16:18:25.577678141Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:25.578445 containerd[1474]: time="2025-02-13T16:18:25.577792603Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:25.578445 containerd[1474]: time="2025-02-13T16:18:25.577803644Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:25.579267 containerd[1474]: time="2025-02-13T16:18:25.579061653Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:25.579267 containerd[1474]: time="2025-02-13T16:18:25.579171858Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:25.579267 containerd[1474]: time="2025-02-13T16:18:25.579183650Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:25.579760 containerd[1474]: time="2025-02-13T16:18:25.579631789Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:25.579810 
containerd[1474]: time="2025-02-13T16:18:25.579769551Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:25.579810 containerd[1474]: time="2025-02-13T16:18:25.579789017Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:25.582567 containerd[1474]: time="2025-02-13T16:18:25.581863382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:5,}" Feb 13 16:18:25.584269 kubelet[1788]: I0213 16:18:25.584094 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16" Feb 13 16:18:25.586502 containerd[1474]: time="2025-02-13T16:18:25.586454123Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:25.589264 containerd[1474]: time="2025-02-13T16:18:25.586884630Z" level=info msg="Ensure that sandbox e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16 in task-service has been cleanup successfully" Feb 13 16:18:25.590797 containerd[1474]: time="2025-02-13T16:18:25.590637853Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:25.590797 containerd[1474]: time="2025-02-13T16:18:25.590675081Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:25.591563 containerd[1474]: time="2025-02-13T16:18:25.591332885Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:25.591563 containerd[1474]: time="2025-02-13T16:18:25.591485911Z" level=info msg="TearDown network for sandbox 
\"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:25.591563 containerd[1474]: time="2025-02-13T16:18:25.591498633Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:25.592576 containerd[1474]: time="2025-02-13T16:18:25.592376511Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:25.592576 containerd[1474]: time="2025-02-13T16:18:25.592488857Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:25.592576 containerd[1474]: time="2025-02-13T16:18:25.592506720Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:25.592915 containerd[1474]: time="2025-02-13T16:18:25.592892701Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:25.593018 systemd[1]: run-netns-cni\x2dfe90c461\x2d46c0\x2dbe3a\x2d8a0c\x2d43749b423d5f.mount: Deactivated successfully. 
Feb 13 16:18:25.593386 containerd[1474]: time="2025-02-13T16:18:25.593097167Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:25.593386 containerd[1474]: time="2025-02-13T16:18:25.593117349Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:25.596397 containerd[1474]: time="2025-02-13T16:18:25.595899455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:4,}" Feb 13 16:18:25.836726 containerd[1474]: time="2025-02-13T16:18:25.836440045Z" level=error msg="Failed to destroy network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.838085 containerd[1474]: time="2025-02-13T16:18:25.837544582Z" level=error msg="encountered an error cleaning up failed sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.838085 containerd[1474]: time="2025-02-13T16:18:25.837667870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:4,} failed, error" error="failed to setup network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Feb 13 16:18:25.839547 kubelet[1788]: E0213 16:18:25.839513 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.839649 kubelet[1788]: E0213 16:18:25.839601 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:25.839649 kubelet[1788]: E0213 16:18:25.839636 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:25.839871 kubelet[1788]: E0213 16:18:25.839731 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:25.842798 containerd[1474]: time="2025-02-13T16:18:25.842580284Z" level=error msg="Failed to destroy network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.844060 containerd[1474]: time="2025-02-13T16:18:25.843349870Z" level=error msg="encountered an error cleaning up failed sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.844060 containerd[1474]: time="2025-02-13T16:18:25.843472636Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.844625 kubelet[1788]: E0213 16:18:25.844587 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:25.844722 kubelet[1788]: E0213 16:18:25.844708 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:25.844780 kubelet[1788]: E0213 16:18:25.844754 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:25.844879 kubelet[1788]: E0213 16:18:25.844858 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" 
podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:26.132831 kubelet[1788]: E0213 16:18:26.131833 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:26.499827 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389-shm.mount: Deactivated successfully. Feb 13 16:18:26.592148 kubelet[1788]: I0213 16:18:26.592098 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389" Feb 13 16:18:26.593776 containerd[1474]: time="2025-02-13T16:18:26.593342684Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" Feb 13 16:18:26.593776 containerd[1474]: time="2025-02-13T16:18:26.593643718Z" level=info msg="Ensure that sandbox f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389 in task-service has been cleanup successfully" Feb 13 16:18:26.599361 containerd[1474]: time="2025-02-13T16:18:26.596682292Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" Feb 13 16:18:26.599361 containerd[1474]: time="2025-02-13T16:18:26.597025447Z" level=info msg="Ensure that sandbox bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6 in task-service has been cleanup successfully" Feb 13 16:18:26.599361 containerd[1474]: time="2025-02-13T16:18:26.598565499Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully" Feb 13 16:18:26.599361 containerd[1474]: time="2025-02-13T16:18:26.598602986Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully" Feb 13 16:18:26.599597 kubelet[1788]: I0213 16:18:26.596169 1788 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6" Feb 13 16:18:26.599368 systemd[1]: run-netns-cni\x2dbb83ab66\x2d284b\x2d328b\x2d9203\x2d54f38b2abaca.mount: Deactivated successfully. Feb 13 16:18:26.600458 containerd[1474]: time="2025-02-13T16:18:26.600010065Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:26.600458 containerd[1474]: time="2025-02-13T16:18:26.600142735Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:26.600458 containerd[1474]: time="2025-02-13T16:18:26.600161908Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:26.601135 containerd[1474]: time="2025-02-13T16:18:26.601053718Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully" Feb 13 16:18:26.601135 containerd[1474]: time="2025-02-13T16:18:26.601079033Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully" Feb 13 16:18:26.602371 containerd[1474]: time="2025-02-13T16:18:26.602345670Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:26.603328 containerd[1474]: time="2025-02-13T16:18:26.603203730Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:26.603328 containerd[1474]: time="2025-02-13T16:18:26.603252754Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:26.603708 containerd[1474]: time="2025-02-13T16:18:26.602861004Z" level=info msg="StopPodSandbox for 
\"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:26.603939 containerd[1474]: time="2025-02-13T16:18:26.603863510Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:26.603939 containerd[1474]: time="2025-02-13T16:18:26.603890799Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:26.604330 systemd[1]: run-netns-cni\x2dd1429aa7\x2dc5c7\x2d8207\x2dbb11\x2d8354917c4af9.mount: Deactivated successfully. Feb 13 16:18:26.607293 containerd[1474]: time="2025-02-13T16:18:26.606678093Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:26.607293 containerd[1474]: time="2025-02-13T16:18:26.606798405Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:26.607293 containerd[1474]: time="2025-02-13T16:18:26.606809441Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:26.607293 containerd[1474]: time="2025-02-13T16:18:26.606881221Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:26.607293 containerd[1474]: time="2025-02-13T16:18:26.606964482Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:26.607293 containerd[1474]: time="2025-02-13T16:18:26.606973304Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:26.607791 containerd[1474]: time="2025-02-13T16:18:26.607737943Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 
16:18:26.607908 containerd[1474]: time="2025-02-13T16:18:26.607883906Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:26.608097 containerd[1474]: time="2025-02-13T16:18:26.608082670Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:26.608176 containerd[1474]: time="2025-02-13T16:18:26.608164923Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:26.608635 containerd[1474]: time="2025-02-13T16:18:26.607894644Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:26.608635 containerd[1474]: time="2025-02-13T16:18:26.608633223Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:26.609760 containerd[1474]: time="2025-02-13T16:18:26.609532531Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:26.609760 containerd[1474]: time="2025-02-13T16:18:26.609534739Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:26.609760 containerd[1474]: time="2025-02-13T16:18:26.609669763Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:26.609760 containerd[1474]: time="2025-02-13T16:18:26.609683738Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:26.609760 containerd[1474]: time="2025-02-13T16:18:26.609697230Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 
16:18:26.609760 containerd[1474]: time="2025-02-13T16:18:26.609709358Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:26.611804 containerd[1474]: time="2025-02-13T16:18:26.610674113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:6,}" Feb 13 16:18:26.611804 containerd[1474]: time="2025-02-13T16:18:26.610909379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:5,}" Feb 13 16:18:26.801448 containerd[1474]: time="2025-02-13T16:18:26.799563830Z" level=error msg="Failed to destroy network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.811628 containerd[1474]: time="2025-02-13T16:18:26.811547390Z" level=error msg="encountered an error cleaning up failed sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.814505 containerd[1474]: time="2025-02-13T16:18:26.814426982Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.818317 kubelet[1788]: E0213 16:18:26.815089 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.818317 kubelet[1788]: E0213 16:18:26.815174 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:26.818317 kubelet[1788]: E0213 16:18:26.815212 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:26.818586 kubelet[1788]: E0213 16:18:26.815342 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:26.864712 containerd[1474]: time="2025-02-13T16:18:26.864620446Z" level=error msg="Failed to destroy network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.870651 containerd[1474]: time="2025-02-13T16:18:26.870372141Z" level=error msg="encountered an error cleaning up failed sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.871461 containerd[1474]: time="2025-02-13T16:18:26.871142214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:5,} failed, error" error="failed to setup network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.875407 kubelet[1788]: E0213 16:18:26.872736 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:26.875407 kubelet[1788]: E0213 16:18:26.873043 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:26.875407 kubelet[1788]: E0213 16:18:26.873084 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:26.875564 kubelet[1788]: E0213 16:18:26.873171 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:27.133304 kubelet[1788]: E0213 16:18:27.132286 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:27.508078 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1-shm.mount: Deactivated successfully. Feb 13 16:18:27.619535 kubelet[1788]: I0213 16:18:27.618425 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1" Feb 13 16:18:27.623180 containerd[1474]: time="2025-02-13T16:18:27.622944132Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\"" Feb 13 16:18:27.624160 containerd[1474]: time="2025-02-13T16:18:27.624104172Z" level=info msg="Ensure that sandbox 3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1 in task-service has been cleanup successfully" Feb 13 16:18:27.628487 containerd[1474]: time="2025-02-13T16:18:27.627968945Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully" Feb 13 16:18:27.628487 containerd[1474]: time="2025-02-13T16:18:27.628046154Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully" Feb 13 16:18:27.628846 systemd[1]: run-netns-cni\x2d83bec5be\x2d241a\x2df0b6\x2df9b6\x2d689cd9063377.mount: Deactivated successfully. 
Feb 13 16:18:27.632718 containerd[1474]: time="2025-02-13T16:18:27.632387762Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" Feb 13 16:18:27.632718 containerd[1474]: time="2025-02-13T16:18:27.632554069Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully" Feb 13 16:18:27.632718 containerd[1474]: time="2025-02-13T16:18:27.632574656Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully" Feb 13 16:18:27.634026 containerd[1474]: time="2025-02-13T16:18:27.633773479Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:27.634026 containerd[1474]: time="2025-02-13T16:18:27.633906786Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:27.634026 containerd[1474]: time="2025-02-13T16:18:27.633922772Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:27.636823 containerd[1474]: time="2025-02-13T16:18:27.636625476Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:27.636823 containerd[1474]: time="2025-02-13T16:18:27.636752024Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:27.636823 containerd[1474]: time="2025-02-13T16:18:27.636767620Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:27.637994 containerd[1474]: time="2025-02-13T16:18:27.637495275Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:27.637994 
containerd[1474]: time="2025-02-13T16:18:27.637709879Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:27.637994 containerd[1474]: time="2025-02-13T16:18:27.637732654Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:27.638089 kubelet[1788]: I0213 16:18:27.638064 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1" Feb 13 16:18:27.638921 containerd[1474]: time="2025-02-13T16:18:27.638890276Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:27.639156 containerd[1474]: time="2025-02-13T16:18:27.639132662Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:27.639268 containerd[1474]: time="2025-02-13T16:18:27.639249323Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:27.639693 containerd[1474]: time="2025-02-13T16:18:27.639668681Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:27.640126 containerd[1474]: time="2025-02-13T16:18:27.640100359Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:27.642452 containerd[1474]: time="2025-02-13T16:18:27.642299006Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:27.642953 containerd[1474]: time="2025-02-13T16:18:27.642931851Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\"" Feb 13 
16:18:27.643491 containerd[1474]: time="2025-02-13T16:18:27.643301212Z" level=info msg="Ensure that sandbox a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1 in task-service has been cleanup successfully" Feb 13 16:18:27.643735 containerd[1474]: time="2025-02-13T16:18:27.643716276Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully" Feb 13 16:18:27.646326 containerd[1474]: time="2025-02-13T16:18:27.646281943Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully" Feb 13 16:18:27.647855 containerd[1474]: time="2025-02-13T16:18:27.647540658Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" Feb 13 16:18:27.648796 containerd[1474]: time="2025-02-13T16:18:27.648113218Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully" Feb 13 16:18:27.648796 containerd[1474]: time="2025-02-13T16:18:27.648135458Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully" Feb 13 16:18:27.648796 containerd[1474]: time="2025-02-13T16:18:27.648341070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:7,}" Feb 13 16:18:27.648423 systemd[1]: run-netns-cni\x2d9e6074ef\x2de3cc\x2d339f\x2da1ed\x2d8fb2d433d466.mount: Deactivated successfully. 
Feb 13 16:18:27.651142 containerd[1474]: time="2025-02-13T16:18:27.650689896Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:27.651142 containerd[1474]: time="2025-02-13T16:18:27.650838562Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:27.651142 containerd[1474]: time="2025-02-13T16:18:27.650855842Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:27.654341 containerd[1474]: time="2025-02-13T16:18:27.654088543Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:27.654341 containerd[1474]: time="2025-02-13T16:18:27.654211956Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:27.654341 containerd[1474]: time="2025-02-13T16:18:27.654245730Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:27.656003 containerd[1474]: time="2025-02-13T16:18:27.655944697Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:27.656894 containerd[1474]: time="2025-02-13T16:18:27.656430415Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:27.656894 containerd[1474]: time="2025-02-13T16:18:27.656470654Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:27.657840 containerd[1474]: time="2025-02-13T16:18:27.657417596Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:27.657840 
containerd[1474]: time="2025-02-13T16:18:27.657587595Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:27.657840 containerd[1474]: time="2025-02-13T16:18:27.657606006Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:27.661866 containerd[1474]: time="2025-02-13T16:18:27.661811954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:6,}" Feb 13 16:18:27.827106 kubelet[1788]: I0213 16:18:27.826630 1788 topology_manager.go:215] "Topology Admit Handler" podUID="1301e2bc-4cf4-4cf7-8bdc-494597309b27" podNamespace="calico-system" podName="calico-typha-5856dc5d7f-4qnh9" Feb 13 16:18:27.850141 systemd[1]: Created slice kubepods-besteffort-pod1301e2bc_4cf4_4cf7_8bdc_494597309b27.slice - libcontainer container kubepods-besteffort-pod1301e2bc_4cf4_4cf7_8bdc_494597309b27.slice. 
Feb 13 16:18:27.875319 containerd[1474]: time="2025-02-13T16:18:27.875178048Z" level=error msg="Failed to destroy network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:27.875990 containerd[1474]: time="2025-02-13T16:18:27.875946236Z" level=error msg="encountered an error cleaning up failed sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:27.876190 containerd[1474]: time="2025-02-13T16:18:27.876161404Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:7,} failed, error" error="failed to setup network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:27.877291 kubelet[1788]: E0213 16:18:27.876691 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:27.877291 kubelet[1788]: E0213 16:18:27.876773 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:27.877291 kubelet[1788]: E0213 16:18:27.876806 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:27.877486 kubelet[1788]: E0213 16:18:27.876897 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:27.896153 kubelet[1788]: I0213 16:18:27.896108 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1301e2bc-4cf4-4cf7-8bdc-494597309b27-typha-certs\") pod \"calico-typha-5856dc5d7f-4qnh9\" (UID: \"1301e2bc-4cf4-4cf7-8bdc-494597309b27\") " 
pod="calico-system/calico-typha-5856dc5d7f-4qnh9" Feb 13 16:18:27.896743 kubelet[1788]: I0213 16:18:27.896650 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1301e2bc-4cf4-4cf7-8bdc-494597309b27-tigera-ca-bundle\") pod \"calico-typha-5856dc5d7f-4qnh9\" (UID: \"1301e2bc-4cf4-4cf7-8bdc-494597309b27\") " pod="calico-system/calico-typha-5856dc5d7f-4qnh9" Feb 13 16:18:27.896743 kubelet[1788]: I0213 16:18:27.896691 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmnf\" (UniqueName: \"kubernetes.io/projected/1301e2bc-4cf4-4cf7-8bdc-494597309b27-kube-api-access-kxmnf\") pod \"calico-typha-5856dc5d7f-4qnh9\" (UID: \"1301e2bc-4cf4-4cf7-8bdc-494597309b27\") " pod="calico-system/calico-typha-5856dc5d7f-4qnh9" Feb 13 16:18:28.050295 containerd[1474]: time="2025-02-13T16:18:28.050072812Z" level=error msg="Failed to destroy network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.050592 containerd[1474]: time="2025-02-13T16:18:28.050557333Z" level=error msg="encountered an error cleaning up failed sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.050684 containerd[1474]: time="2025-02-13T16:18:28.050656259Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:6,} failed, error" error="failed to 
setup network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.051508 kubelet[1788]: E0213 16:18:28.050972 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.051508 kubelet[1788]: E0213 16:18:28.051037 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:28.051508 kubelet[1788]: E0213 16:18:28.051067 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:28.051665 kubelet[1788]: E0213 16:18:28.051130 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:28.135584 kubelet[1788]: E0213 16:18:28.134119 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:28.155124 kubelet[1788]: E0213 16:18:28.154412 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:28.155349 containerd[1474]: time="2025-02-13T16:18:28.155195369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5856dc5d7f-4qnh9,Uid:1301e2bc-4cf4-4cf7-8bdc-494597309b27,Namespace:calico-system,Attempt:0,}" Feb 13 16:18:28.263463 containerd[1474]: time="2025-02-13T16:18:28.263322862Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:28.263463 containerd[1474]: time="2025-02-13T16:18:28.263387546Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:28.263463 containerd[1474]: time="2025-02-13T16:18:28.263401022Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:28.265369 containerd[1474]: time="2025-02-13T16:18:28.265128261Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:28.335611 systemd[1]: Started cri-containerd-7c02d8043a415d78039ea4042a791c4a12ca3746e8b16d2d499b855c8de93d8a.scope - libcontainer container 7c02d8043a415d78039ea4042a791c4a12ca3746e8b16d2d499b855c8de93d8a. Feb 13 16:18:28.462201 containerd[1474]: time="2025-02-13T16:18:28.462023263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5856dc5d7f-4qnh9,Uid:1301e2bc-4cf4-4cf7-8bdc-494597309b27,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c02d8043a415d78039ea4042a791c4a12ca3746e8b16d2d499b855c8de93d8a\"" Feb 13 16:18:28.464427 kubelet[1788]: E0213 16:18:28.463610 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:28.519309 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894-shm.mount: Deactivated successfully. 
Feb 13 16:18:28.653379 kubelet[1788]: I0213 16:18:28.649936 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894" Feb 13 16:18:28.654103 containerd[1474]: time="2025-02-13T16:18:28.652687069Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\"" Feb 13 16:18:28.654103 containerd[1474]: time="2025-02-13T16:18:28.653343363Z" level=info msg="Ensure that sandbox 2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894 in task-service has been cleanup successfully" Feb 13 16:18:28.657883 containerd[1474]: time="2025-02-13T16:18:28.655395901Z" level=info msg="TearDown network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" successfully" Feb 13 16:18:28.657883 containerd[1474]: time="2025-02-13T16:18:28.655466721Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" returns successfully" Feb 13 16:18:28.657883 containerd[1474]: time="2025-02-13T16:18:28.657469613Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\"" Feb 13 16:18:28.657883 containerd[1474]: time="2025-02-13T16:18:28.657596716Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully" Feb 13 16:18:28.657883 containerd[1474]: time="2025-02-13T16:18:28.657698909Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully" Feb 13 16:18:28.659427 systemd[1]: run-netns-cni\x2db8fdee73\x2dfd05\x2d2fbe\x2ddbb6\x2d27bc1185b852.mount: Deactivated successfully. 
Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.660561740Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.660845254Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.660888551Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.661478804Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.661597581Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.661617607Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.662890336Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.663115913Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:28.663385 containerd[1474]: time="2025-02-13T16:18:28.663160122Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:28.664937 containerd[1474]: time="2025-02-13T16:18:28.664313563Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:28.664937 
containerd[1474]: time="2025-02-13T16:18:28.664501837Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:28.664937 containerd[1474]: time="2025-02-13T16:18:28.664533192Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:28.665993 containerd[1474]: time="2025-02-13T16:18:28.665472631Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:28.665993 containerd[1474]: time="2025-02-13T16:18:28.665589414Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:28.665993 containerd[1474]: time="2025-02-13T16:18:28.665605328Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:28.666419 containerd[1474]: time="2025-02-13T16:18:28.666135151Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:28.666419 containerd[1474]: time="2025-02-13T16:18:28.666350597Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:28.666419 containerd[1474]: time="2025-02-13T16:18:28.666386272Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:28.667441 containerd[1474]: time="2025-02-13T16:18:28.667407449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:8,}" Feb 13 16:18:28.672255 kubelet[1788]: I0213 16:18:28.671182 1788 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73" Feb 13 16:18:28.673641 containerd[1474]: time="2025-02-13T16:18:28.673584130Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\"" Feb 13 16:18:28.674525 containerd[1474]: time="2025-02-13T16:18:28.674480689Z" level=info msg="Ensure that sandbox 625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73 in task-service has been cleanup successfully" Feb 13 16:18:28.678432 containerd[1474]: time="2025-02-13T16:18:28.678363374Z" level=info msg="TearDown network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" successfully" Feb 13 16:18:28.678631 containerd[1474]: time="2025-02-13T16:18:28.678611621Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" returns successfully" Feb 13 16:18:28.680148 systemd[1]: run-netns-cni\x2df875b7a3\x2dcb44\x2de548\x2d0c4c\x2d6b9915d0bc56.mount: Deactivated successfully. 
Feb 13 16:18:28.686209 containerd[1474]: time="2025-02-13T16:18:28.686138615Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\"" Feb 13 16:18:28.689606 containerd[1474]: time="2025-02-13T16:18:28.689550861Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully" Feb 13 16:18:28.690000 containerd[1474]: time="2025-02-13T16:18:28.689795684Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully" Feb 13 16:18:28.693590 containerd[1474]: time="2025-02-13T16:18:28.693403763Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" Feb 13 16:18:28.693590 containerd[1474]: time="2025-02-13T16:18:28.693589804Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully" Feb 13 16:18:28.694600 containerd[1474]: time="2025-02-13T16:18:28.693613252Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully" Feb 13 16:18:28.697009 containerd[1474]: time="2025-02-13T16:18:28.695362179Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:28.697009 containerd[1474]: time="2025-02-13T16:18:28.696142630Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:28.697009 containerd[1474]: time="2025-02-13T16:18:28.696172677Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:28.699076 containerd[1474]: time="2025-02-13T16:18:28.698944730Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:28.699684 
containerd[1474]: time="2025-02-13T16:18:28.699653815Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:28.702257 containerd[1474]: time="2025-02-13T16:18:28.700327042Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:28.705712 containerd[1474]: time="2025-02-13T16:18:28.705653694Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:28.705712 containerd[1474]: time="2025-02-13T16:18:28.705833392Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:28.708899 containerd[1474]: time="2025-02-13T16:18:28.705849585Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:28.718045 containerd[1474]: time="2025-02-13T16:18:28.715560698Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:28.718045 containerd[1474]: time="2025-02-13T16:18:28.715765127Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:28.718045 containerd[1474]: time="2025-02-13T16:18:28.715787981Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:28.721596 containerd[1474]: time="2025-02-13T16:18:28.721315043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:7,}" Feb 13 16:18:28.935277 containerd[1474]: time="2025-02-13T16:18:28.934212101Z" level=error msg="Failed to destroy network for sandbox 
\"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.936803 containerd[1474]: time="2025-02-13T16:18:28.936707591Z" level=error msg="encountered an error cleaning up failed sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.936946 containerd[1474]: time="2025-02-13T16:18:28.936886194Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:8,} failed, error" error="failed to setup network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.938975 kubelet[1788]: E0213 16:18:28.937283 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.938975 kubelet[1788]: E0213 16:18:28.938490 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:28.938975 kubelet[1788]: E0213 16:18:28.938529 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:28.939277 kubelet[1788]: E0213 16:18:28.938642 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:28.988867 containerd[1474]: time="2025-02-13T16:18:28.988723244Z" level=error msg="Failed to destroy network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.989576 containerd[1474]: time="2025-02-13T16:18:28.989524403Z" level=error msg="encountered an error cleaning up 
failed sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.989973 containerd[1474]: time="2025-02-13T16:18:28.989821062Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:7,} failed, error" error="failed to setup network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.991094 kubelet[1788]: E0213 16:18:28.990574 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:28.991094 kubelet[1788]: E0213 16:18:28.990658 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:28.991094 kubelet[1788]: E0213 16:18:28.990693 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:28.991543 kubelet[1788]: E0213 16:18:28.990777 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:29.135361 kubelet[1788]: E0213 16:18:29.135258 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:29.503190 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585-shm.mount: Deactivated successfully. 
Feb 13 16:18:29.703920 kubelet[1788]: I0213 16:18:29.703835 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585" Feb 13 16:18:29.706521 containerd[1474]: time="2025-02-13T16:18:29.705564330Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\"" Feb 13 16:18:29.707945 containerd[1474]: time="2025-02-13T16:18:29.707683163Z" level=info msg="Ensure that sandbox 3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585 in task-service has been cleanup successfully" Feb 13 16:18:29.715529 kubelet[1788]: I0213 16:18:29.713697 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84" Feb 13 16:18:29.715702 containerd[1474]: time="2025-02-13T16:18:29.714474079Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\"" Feb 13 16:18:29.715702 containerd[1474]: time="2025-02-13T16:18:29.714704373Z" level=info msg="Ensure that sandbox 4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84 in task-service has been cleanup successfully" Feb 13 16:18:29.716220 containerd[1474]: time="2025-02-13T16:18:29.715885289Z" level=info msg="TearDown network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" successfully" Feb 13 16:18:29.716220 containerd[1474]: time="2025-02-13T16:18:29.715934853Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" returns successfully" Feb 13 16:18:29.716220 containerd[1474]: time="2025-02-13T16:18:29.715890291Z" level=info msg="TearDown network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" successfully" Feb 13 16:18:29.716220 containerd[1474]: time="2025-02-13T16:18:29.716035681Z" level=info msg="StopPodSandbox for 
\"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" returns successfully" Feb 13 16:18:29.717251 containerd[1474]: time="2025-02-13T16:18:29.716854822Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\"" Feb 13 16:18:29.717251 containerd[1474]: time="2025-02-13T16:18:29.716993195Z" level=info msg="TearDown network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" successfully" Feb 13 16:18:29.717251 containerd[1474]: time="2025-02-13T16:18:29.717010818Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" returns successfully" Feb 13 16:18:29.717251 containerd[1474]: time="2025-02-13T16:18:29.717089538Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\"" Feb 13 16:18:29.717251 containerd[1474]: time="2025-02-13T16:18:29.717175008Z" level=info msg="TearDown network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" successfully" Feb 13 16:18:29.717251 containerd[1474]: time="2025-02-13T16:18:29.717189660Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" returns successfully" Feb 13 16:18:29.721278 containerd[1474]: time="2025-02-13T16:18:29.719818546Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\"" Feb 13 16:18:29.721278 containerd[1474]: time="2025-02-13T16:18:29.719984204Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\"" Feb 13 16:18:29.721278 containerd[1474]: time="2025-02-13T16:18:29.719996017Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully" Feb 13 16:18:29.721278 containerd[1474]: time="2025-02-13T16:18:29.720181820Z" level=info msg="StopPodSandbox for 
\"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully" Feb 13 16:18:29.720938 systemd[1]: run-netns-cni\x2d73c468c4\x2d6670\x2d3ecc\x2d3600\x2dca04ad201899.mount: Deactivated successfully. Feb 13 16:18:29.727269 containerd[1474]: time="2025-02-13T16:18:29.722707791Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" Feb 13 16:18:29.727269 containerd[1474]: time="2025-02-13T16:18:29.723787112Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully" Feb 13 16:18:29.727269 containerd[1474]: time="2025-02-13T16:18:29.723951191Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully" Feb 13 16:18:29.727269 containerd[1474]: time="2025-02-13T16:18:29.726351872Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully" Feb 13 16:18:29.727269 containerd[1474]: time="2025-02-13T16:18:29.726456093Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully" Feb 13 16:18:29.728753 systemd[1]: run-netns-cni\x2d7b233ce3\x2d58e5\x2ddc6b\x2daa96\x2ddf207cc72400.mount: Deactivated successfully. 
Feb 13 16:18:29.730247 containerd[1474]: time="2025-02-13T16:18:29.729654584Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" Feb 13 16:18:29.732196 containerd[1474]: time="2025-02-13T16:18:29.732145624Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully" Feb 13 16:18:29.733084 containerd[1474]: time="2025-02-13T16:18:29.730376218Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:29.733084 containerd[1474]: time="2025-02-13T16:18:29.732915584Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully" Feb 13 16:18:29.733084 containerd[1474]: time="2025-02-13T16:18:29.732998278Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:29.733084 containerd[1474]: time="2025-02-13T16:18:29.733016294Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:29.734678 containerd[1474]: time="2025-02-13T16:18:29.734288866Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:29.734678 containerd[1474]: time="2025-02-13T16:18:29.734417145Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:29.734678 containerd[1474]: time="2025-02-13T16:18:29.734433672Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:29.734678 containerd[1474]: time="2025-02-13T16:18:29.734527272Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:29.734678 
containerd[1474]: time="2025-02-13T16:18:29.734607254Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:29.734678 containerd[1474]: time="2025-02-13T16:18:29.734620064Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:29.737012 containerd[1474]: time="2025-02-13T16:18:29.736961978Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:29.737886 containerd[1474]: time="2025-02-13T16:18:29.736969066Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:29.738162 containerd[1474]: time="2025-02-13T16:18:29.738011393Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:29.738162 containerd[1474]: time="2025-02-13T16:18:29.738032465Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:29.738162 containerd[1474]: time="2025-02-13T16:18:29.738077185Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:29.738162 containerd[1474]: time="2025-02-13T16:18:29.738102837Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:29.739112 containerd[1474]: time="2025-02-13T16:18:29.738796166Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:29.739112 containerd[1474]: time="2025-02-13T16:18:29.738895017Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:29.739112 
containerd[1474]: time="2025-02-13T16:18:29.738963766Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:29.739770 containerd[1474]: time="2025-02-13T16:18:29.738964439Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:29.741092 containerd[1474]: time="2025-02-13T16:18:29.740835893Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:29.741092 containerd[1474]: time="2025-02-13T16:18:29.740981742Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:29.741092 containerd[1474]: time="2025-02-13T16:18:29.740998337Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:29.741619 containerd[1474]: time="2025-02-13T16:18:29.741396352Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:29.741619 containerd[1474]: time="2025-02-13T16:18:29.741420372Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:29.744556 containerd[1474]: time="2025-02-13T16:18:29.743731503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:8,}" Feb 13 16:18:29.744556 containerd[1474]: time="2025-02-13T16:18:29.744144398Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:29.744556 containerd[1474]: time="2025-02-13T16:18:29.744432355Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" 
successfully" Feb 13 16:18:29.744556 containerd[1474]: time="2025-02-13T16:18:29.744455267Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:29.745977 containerd[1474]: time="2025-02-13T16:18:29.745941969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:9,}" Feb 13 16:18:30.057527 containerd[1474]: time="2025-02-13T16:18:30.056980661Z" level=error msg="Failed to destroy network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.060648 containerd[1474]: time="2025-02-13T16:18:30.060211720Z" level=error msg="encountered an error cleaning up failed sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.060648 containerd[1474]: time="2025-02-13T16:18:30.060452840Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:8,} failed, error" error="failed to setup network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.061306 kubelet[1788]: E0213 16:18:30.061182 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.061868 kubelet[1788]: E0213 16:18:30.061835 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:30.061989 kubelet[1788]: E0213 16:18:30.061890 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:30.062129 kubelet[1788]: E0213 16:18:30.061998 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:30.087817 containerd[1474]: time="2025-02-13T16:18:30.087316537Z" level=error msg="Failed to destroy network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.088551 containerd[1474]: time="2025-02-13T16:18:30.088492983Z" level=error msg="encountered an error cleaning up failed sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.088761 containerd[1474]: time="2025-02-13T16:18:30.088730528Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:9,} failed, error" error="failed to setup network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.093482 kubelet[1788]: E0213 16:18:30.092611 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.093482 kubelet[1788]: E0213 
16:18:30.092709 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:30.093482 kubelet[1788]: E0213 16:18:30.092752 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:30.093733 kubelet[1788]: E0213 16:18:30.092859 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:30.136205 kubelet[1788]: E0213 16:18:30.136124 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:30.218766 systemd[1]: Started 
sshd@11-64.227.101.255:22-161.35.231.77:35116.service - OpenSSH per-connection server daemon (161.35.231.77:35116). Feb 13 16:18:30.326379 kubelet[1788]: I0213 16:18:30.323293 1788 topology_manager.go:215] "Topology Admit Handler" podUID="65a0808a-15de-4f43-bc29-6bb453f1a0be" podNamespace="calico-system" podName="calico-kube-controllers-69d5998dc-cpdsh" Feb 13 16:18:30.347479 systemd[1]: Created slice kubepods-besteffort-pod65a0808a_15de_4f43_bc29_6bb453f1a0be.slice - libcontainer container kubepods-besteffort-pod65a0808a_15de_4f43_bc29_6bb453f1a0be.slice. Feb 13 16:18:30.369465 sshd[2941]: Invalid user pwserver from 161.35.231.77 port 35116 Feb 13 16:18:30.397590 sshd[2941]: Received disconnect from 161.35.231.77 port 35116:11: Bye Bye [preauth] Feb 13 16:18:30.397590 sshd[2941]: Disconnected from invalid user pwserver 161.35.231.77 port 35116 [preauth] Feb 13 16:18:30.406107 systemd[1]: sshd@11-64.227.101.255:22-161.35.231.77:35116.service: Deactivated successfully. Feb 13 16:18:30.424623 kubelet[1788]: I0213 16:18:30.424547 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a0808a-15de-4f43-bc29-6bb453f1a0be-tigera-ca-bundle\") pod \"calico-kube-controllers-69d5998dc-cpdsh\" (UID: \"65a0808a-15de-4f43-bc29-6bb453f1a0be\") " pod="calico-system/calico-kube-controllers-69d5998dc-cpdsh" Feb 13 16:18:30.424840 kubelet[1788]: I0213 16:18:30.424648 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt86p\" (UniqueName: \"kubernetes.io/projected/65a0808a-15de-4f43-bc29-6bb453f1a0be-kube-api-access-rt86p\") pod \"calico-kube-controllers-69d5998dc-cpdsh\" (UID: \"65a0808a-15de-4f43-bc29-6bb453f1a0be\") " pod="calico-system/calico-kube-controllers-69d5998dc-cpdsh" Feb 13 16:18:30.503411 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c-shm.mount: Deactivated successfully. Feb 13 16:18:30.656050 containerd[1474]: time="2025-02-13T16:18:30.655631386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d5998dc-cpdsh,Uid:65a0808a-15de-4f43-bc29-6bb453f1a0be,Namespace:calico-system,Attempt:0,}" Feb 13 16:18:30.734109 kubelet[1788]: I0213 16:18:30.734057 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f" Feb 13 16:18:30.738076 containerd[1474]: time="2025-02-13T16:18:30.737406607Z" level=info msg="StopPodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\"" Feb 13 16:18:30.738580 containerd[1474]: time="2025-02-13T16:18:30.738486483Z" level=info msg="Ensure that sandbox 2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f in task-service has been cleanup successfully" Feb 13 16:18:30.740577 containerd[1474]: time="2025-02-13T16:18:30.738795535Z" level=info msg="TearDown network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" successfully" Feb 13 16:18:30.740577 containerd[1474]: time="2025-02-13T16:18:30.738829960Z" level=info msg="StopPodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" returns successfully" Feb 13 16:18:30.743004 containerd[1474]: time="2025-02-13T16:18:30.742926176Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\"" Feb 13 16:18:30.743144 containerd[1474]: time="2025-02-13T16:18:30.743105478Z" level=info msg="TearDown network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" successfully" Feb 13 16:18:30.743144 containerd[1474]: time="2025-02-13T16:18:30.743127371Z" level=info msg="StopPodSandbox for 
\"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" returns successfully" Feb 13 16:18:30.744526 systemd[1]: run-netns-cni\x2dc1d61b03\x2daa0f\x2d0478\x2dd57e\x2d7bc1fb1b0054.mount: Deactivated successfully. Feb 13 16:18:30.745285 containerd[1474]: time="2025-02-13T16:18:30.745044438Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\"" Feb 13 16:18:30.745285 containerd[1474]: time="2025-02-13T16:18:30.745270493Z" level=info msg="TearDown network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" successfully" Feb 13 16:18:30.745285 containerd[1474]: time="2025-02-13T16:18:30.745283981Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" returns successfully" Feb 13 16:18:30.747316 containerd[1474]: time="2025-02-13T16:18:30.746605497Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\"" Feb 13 16:18:30.747316 containerd[1474]: time="2025-02-13T16:18:30.746805669Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully" Feb 13 16:18:30.747316 containerd[1474]: time="2025-02-13T16:18:30.746824338Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully" Feb 13 16:18:30.750621 containerd[1474]: time="2025-02-13T16:18:30.749736775Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" Feb 13 16:18:30.750621 containerd[1474]: time="2025-02-13T16:18:30.749882820Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully" Feb 13 16:18:30.750621 containerd[1474]: time="2025-02-13T16:18:30.749900687Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns 
successfully" Feb 13 16:18:30.753391 containerd[1474]: time="2025-02-13T16:18:30.753336311Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:30.753653 containerd[1474]: time="2025-02-13T16:18:30.753499155Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:30.753653 containerd[1474]: time="2025-02-13T16:18:30.753519136Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:30.755362 containerd[1474]: time="2025-02-13T16:18:30.755131539Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:30.757485 containerd[1474]: time="2025-02-13T16:18:30.757254871Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:30.757485 containerd[1474]: time="2025-02-13T16:18:30.757288025Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:30.758032 containerd[1474]: time="2025-02-13T16:18:30.758002173Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:30.758478 containerd[1474]: time="2025-02-13T16:18:30.758450251Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:30.758637 containerd[1474]: time="2025-02-13T16:18:30.758471473Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:30.759132 kubelet[1788]: I0213 16:18:30.758979 1788 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c" Feb 13 16:18:30.761060 containerd[1474]: time="2025-02-13T16:18:30.760992226Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:30.761779 containerd[1474]: time="2025-02-13T16:18:30.761003769Z" level=info msg="StopPodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\"" Feb 13 16:18:30.761779 containerd[1474]: time="2025-02-13T16:18:30.761158230Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:30.761779 containerd[1474]: time="2025-02-13T16:18:30.761177882Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:30.761779 containerd[1474]: time="2025-02-13T16:18:30.761540540Z" level=info msg="Ensure that sandbox 48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c in task-service has been cleanup successfully" Feb 13 16:18:30.762488 containerd[1474]: time="2025-02-13T16:18:30.762186650Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:30.762488 containerd[1474]: time="2025-02-13T16:18:30.762359733Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:30.762488 containerd[1474]: time="2025-02-13T16:18:30.762378476Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:30.763480 containerd[1474]: time="2025-02-13T16:18:30.763438249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:10,}" Feb 13 16:18:30.769270 containerd[1474]: time="2025-02-13T16:18:30.766885378Z" 
level=info msg="TearDown network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" successfully" Feb 13 16:18:30.770811 containerd[1474]: time="2025-02-13T16:18:30.770269111Z" level=info msg="StopPodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" returns successfully" Feb 13 16:18:30.774202 containerd[1474]: time="2025-02-13T16:18:30.772198250Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\"" Feb 13 16:18:30.774202 containerd[1474]: time="2025-02-13T16:18:30.773331253Z" level=info msg="TearDown network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" successfully" Feb 13 16:18:30.774202 containerd[1474]: time="2025-02-13T16:18:30.773395188Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" returns successfully" Feb 13 16:18:30.773820 systemd[1]: run-netns-cni\x2df6ce5631\x2dad21\x2d0717\x2d2adf\x2d0d899d11e2b8.mount: Deactivated successfully. 
Feb 13 16:18:30.775814 containerd[1474]: time="2025-02-13T16:18:30.775297252Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\"" Feb 13 16:18:30.775814 containerd[1474]: time="2025-02-13T16:18:30.775535130Z" level=info msg="TearDown network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" successfully" Feb 13 16:18:30.775814 containerd[1474]: time="2025-02-13T16:18:30.775550103Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" returns successfully" Feb 13 16:18:30.778145 containerd[1474]: time="2025-02-13T16:18:30.778083668Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\"" Feb 13 16:18:30.778513 containerd[1474]: time="2025-02-13T16:18:30.778476025Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully" Feb 13 16:18:30.778513 containerd[1474]: time="2025-02-13T16:18:30.778505947Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully" Feb 13 16:18:30.783154 containerd[1474]: time="2025-02-13T16:18:30.782064748Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" Feb 13 16:18:30.784861 containerd[1474]: time="2025-02-13T16:18:30.784279008Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully" Feb 13 16:18:30.785086 containerd[1474]: time="2025-02-13T16:18:30.785049796Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully" Feb 13 16:18:30.787021 containerd[1474]: time="2025-02-13T16:18:30.786789228Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:30.789334 
containerd[1474]: time="2025-02-13T16:18:30.789194535Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:30.789462 containerd[1474]: time="2025-02-13T16:18:30.789352443Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:30.791076 containerd[1474]: time="2025-02-13T16:18:30.791025690Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:30.791259 containerd[1474]: time="2025-02-13T16:18:30.791163326Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:30.791259 containerd[1474]: time="2025-02-13T16:18:30.791181991Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:30.792254 containerd[1474]: time="2025-02-13T16:18:30.791622436Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:30.792254 containerd[1474]: time="2025-02-13T16:18:30.791717645Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:30.792254 containerd[1474]: time="2025-02-13T16:18:30.791731167Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:30.794981 containerd[1474]: time="2025-02-13T16:18:30.794426077Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:30.795244 containerd[1474]: time="2025-02-13T16:18:30.795192618Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:30.796492 
containerd[1474]: time="2025-02-13T16:18:30.796456193Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:30.806819 containerd[1474]: time="2025-02-13T16:18:30.806751971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:9,}" Feb 13 16:18:30.885068 systemd[1]: Started sshd@12-64.227.101.255:22-182.16.245.79:45436.service - OpenSSH per-connection server daemon (182.16.245.79:45436). Feb 13 16:18:30.907702 containerd[1474]: time="2025-02-13T16:18:30.906585699Z" level=error msg="Failed to destroy network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.907702 containerd[1474]: time="2025-02-13T16:18:30.907293103Z" level=error msg="encountered an error cleaning up failed sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.907702 containerd[1474]: time="2025-02-13T16:18:30.907375857Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d5998dc-cpdsh,Uid:65a0808a-15de-4f43-bc29-6bb453f1a0be,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 
16:18:30.908452 kubelet[1788]: E0213 16:18:30.908397 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:30.908627 kubelet[1788]: E0213 16:18:30.908496 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69d5998dc-cpdsh" Feb 13 16:18:30.908627 kubelet[1788]: E0213 16:18:30.908532 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-69d5998dc-cpdsh" Feb 13 16:18:30.908891 kubelet[1788]: E0213 16:18:30.908630 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-69d5998dc-cpdsh_calico-system(65a0808a-15de-4f43-bc29-6bb453f1a0be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-69d5998dc-cpdsh_calico-system(65a0808a-15de-4f43-bc29-6bb453f1a0be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-69d5998dc-cpdsh" podUID="65a0808a-15de-4f43-bc29-6bb453f1a0be" Feb 13 16:18:31.039642 containerd[1474]: time="2025-02-13T16:18:31.039511195Z" level=error msg="Failed to destroy network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.042273 containerd[1474]: time="2025-02-13T16:18:31.041076995Z" level=error msg="encountered an error cleaning up failed sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.042535 containerd[1474]: time="2025-02-13T16:18:31.042476369Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:10,} failed, error" error="failed to setup network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.043015 kubelet[1788]: E0213 16:18:31.042977 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.043180 kubelet[1788]: E0213 16:18:31.043050 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:31.043180 kubelet[1788]: E0213 16:18:31.043078 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-jlrrr" Feb 13 16:18:31.043180 kubelet[1788]: E0213 16:18:31.043161 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-jlrrr_calico-system(63a56fc8-68aa-4d63-8400-3078bd2ff61f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-jlrrr" 
podUID="63a56fc8-68aa-4d63-8400-3078bd2ff61f" Feb 13 16:18:31.069440 containerd[1474]: time="2025-02-13T16:18:31.069262896Z" level=error msg="Failed to destroy network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.070266 containerd[1474]: time="2025-02-13T16:18:31.069995303Z" level=error msg="encountered an error cleaning up failed sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.070266 containerd[1474]: time="2025-02-13T16:18:31.070104444Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:9,} failed, error" error="failed to setup network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.071852 kubelet[1788]: E0213 16:18:31.071613 1788 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:18:31.071852 kubelet[1788]: E0213 16:18:31.071700 1788 kuberuntime_sandbox.go:72] "Failed to create sandbox 
for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:31.072202 kubelet[1788]: E0213 16:18:31.072110 1788 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="default/nginx-deployment-6d5f899847-t8l5b" Feb 13 16:18:31.075250 kubelet[1788]: E0213 16:18:31.073805 1788 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"nginx-deployment-6d5f899847-t8l5b_default(f6f14251-baf9-4c3b-8963-ab140ec1e4a0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="default/nginx-deployment-6d5f899847-t8l5b" podUID="f6f14251-baf9-4c3b-8963-ab140ec1e4a0" Feb 13 16:18:31.119128 containerd[1474]: time="2025-02-13T16:18:31.119051000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:31.122632 containerd[1474]: time="2025-02-13T16:18:31.122541702Z" level=info 
msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 16:18:31.128201 containerd[1474]: time="2025-02-13T16:18:31.128120830Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:31.132222 containerd[1474]: time="2025-02-13T16:18:31.132140796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:31.135728 containerd[1474]: time="2025-02-13T16:18:31.135640818Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 11.650679087s" Feb 13 16:18:31.135728 containerd[1474]: time="2025-02-13T16:18:31.135727571Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 16:18:31.136645 kubelet[1788]: E0213 16:18:31.136595 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:31.136975 containerd[1474]: time="2025-02-13T16:18:31.136934980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 16:18:31.149719 containerd[1474]: time="2025-02-13T16:18:31.149547401Z" level=info msg="CreateContainer within sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 16:18:31.204603 containerd[1474]: time="2025-02-13T16:18:31.204534590Z" 
level=info msg="CreateContainer within sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\"" Feb 13 16:18:31.206092 containerd[1474]: time="2025-02-13T16:18:31.205850775Z" level=info msg="StartContainer for \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\"" Feb 13 16:18:31.243660 systemd[1]: Started cri-containerd-b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5.scope - libcontainer container b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5. Feb 13 16:18:31.299108 containerd[1474]: time="2025-02-13T16:18:31.299029571Z" level=info msg="StartContainer for \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\" returns successfully" Feb 13 16:18:31.419344 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 16:18:31.419541 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Feb 13 16:18:31.505652 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b-shm.mount: Deactivated successfully. Feb 13 16:18:31.506573 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2751595452.mount: Deactivated successfully. 
Feb 13 16:18:31.771342 kubelet[1788]: I0213 16:18:31.771166 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84" Feb 13 16:18:31.774466 containerd[1474]: time="2025-02-13T16:18:31.772120232Z" level=info msg="StopPodSandbox for \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\"" Feb 13 16:18:31.774466 containerd[1474]: time="2025-02-13T16:18:31.772388875Z" level=info msg="Ensure that sandbox 534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84 in task-service has been cleanup successfully" Feb 13 16:18:31.774466 containerd[1474]: time="2025-02-13T16:18:31.774346109Z" level=info msg="TearDown network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\" successfully" Feb 13 16:18:31.774466 containerd[1474]: time="2025-02-13T16:18:31.774386645Z" level=info msg="StopPodSandbox for \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\" returns successfully" Feb 13 16:18:31.779495 containerd[1474]: time="2025-02-13T16:18:31.775352295Z" level=info msg="StopPodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\"" Feb 13 16:18:31.779495 containerd[1474]: time="2025-02-13T16:18:31.775516179Z" level=info msg="TearDown network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" successfully" Feb 13 16:18:31.779495 containerd[1474]: time="2025-02-13T16:18:31.775528796Z" level=info msg="StopPodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" returns successfully" Feb 13 16:18:31.779495 containerd[1474]: time="2025-02-13T16:18:31.778473215Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\"" Feb 13 16:18:31.779495 containerd[1474]: time="2025-02-13T16:18:31.778657075Z" level=info msg="TearDown network for sandbox 
\"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" successfully" Feb 13 16:18:31.779495 containerd[1474]: time="2025-02-13T16:18:31.778678298Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" returns successfully" Feb 13 16:18:31.779397 systemd[1]: run-netns-cni\x2d5f19c7fb\x2d0f2b\x2dbd4a\x2d00e1\x2d453d3c057fb0.mount: Deactivated successfully. Feb 13 16:18:31.781340 containerd[1474]: time="2025-02-13T16:18:31.780045370Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\"" Feb 13 16:18:31.781340 containerd[1474]: time="2025-02-13T16:18:31.780169393Z" level=info msg="TearDown network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" successfully" Feb 13 16:18:31.781340 containerd[1474]: time="2025-02-13T16:18:31.780185630Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" returns successfully" Feb 13 16:18:31.781340 containerd[1474]: time="2025-02-13T16:18:31.780812452Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\"" Feb 13 16:18:31.781340 containerd[1474]: time="2025-02-13T16:18:31.781131354Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully" Feb 13 16:18:31.781340 containerd[1474]: time="2025-02-13T16:18:31.781155111Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully" Feb 13 16:18:31.783914 containerd[1474]: time="2025-02-13T16:18:31.783881213Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\"" Feb 13 16:18:31.784492 containerd[1474]: time="2025-02-13T16:18:31.784346114Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" 
successfully" Feb 13 16:18:31.784492 containerd[1474]: time="2025-02-13T16:18:31.784410364Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully" Feb 13 16:18:31.785699 containerd[1474]: time="2025-02-13T16:18:31.785552463Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:18:31.786376 containerd[1474]: time="2025-02-13T16:18:31.786247808Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:18:31.786376 containerd[1474]: time="2025-02-13T16:18:31.786275246Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:18:31.786646 kubelet[1788]: I0213 16:18:31.786300 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a" Feb 13 16:18:31.787537 containerd[1474]: time="2025-02-13T16:18:31.787506235Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:18:31.787658 containerd[1474]: time="2025-02-13T16:18:31.787630373Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:18:31.787658 containerd[1474]: time="2025-02-13T16:18:31.787647848Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:18:31.788663 kubelet[1788]: I0213 16:18:31.788621 1788 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b" Feb 13 16:18:31.789126 containerd[1474]: time="2025-02-13T16:18:31.789077981Z" level=info msg="StopPodSandbox for 
\"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:18:31.789219 containerd[1474]: time="2025-02-13T16:18:31.789174699Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:18:31.789219 containerd[1474]: time="2025-02-13T16:18:31.789185012Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:18:31.790130 containerd[1474]: time="2025-02-13T16:18:31.789630380Z" level=info msg="StopPodSandbox for \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\"" Feb 13 16:18:31.790130 containerd[1474]: time="2025-02-13T16:18:31.789721867Z" level=info msg="StopPodSandbox for \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\"" Feb 13 16:18:31.790130 containerd[1474]: time="2025-02-13T16:18:31.789895035Z" level=info msg="Ensure that sandbox 8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b in task-service has been cleanup successfully" Feb 13 16:18:31.790130 containerd[1474]: time="2025-02-13T16:18:31.789962993Z" level=info msg="Ensure that sandbox 5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a in task-service has been cleanup successfully" Feb 13 16:18:31.790542 containerd[1474]: time="2025-02-13T16:18:31.790462820Z" level=info msg="TearDown network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\" successfully" Feb 13 16:18:31.790542 containerd[1474]: time="2025-02-13T16:18:31.790502412Z" level=info msg="StopPodSandbox for \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\" returns successfully" Feb 13 16:18:31.792299 containerd[1474]: time="2025-02-13T16:18:31.791112187Z" level=info msg="TearDown network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\" successfully" Feb 13 16:18:31.794495 containerd[1474]: 
time="2025-02-13T16:18:31.794338440Z" level=info msg="StopPodSandbox for \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\" returns successfully" Feb 13 16:18:31.794495 containerd[1474]: time="2025-02-13T16:18:31.791124504Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:18:31.796633 containerd[1474]: time="2025-02-13T16:18:31.791243620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d5998dc-cpdsh,Uid:65a0808a-15de-4f43-bc29-6bb453f1a0be,Namespace:calico-system,Attempt:1,}" Feb 13 16:18:31.799267 containerd[1474]: time="2025-02-13T16:18:31.797035924Z" level=info msg="StopPodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\"" Feb 13 16:18:31.799267 containerd[1474]: time="2025-02-13T16:18:31.797265225Z" level=info msg="TearDown network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" successfully" Feb 13 16:18:31.799267 containerd[1474]: time="2025-02-13T16:18:31.797290242Z" level=info msg="StopPodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" returns successfully" Feb 13 16:18:31.799267 containerd[1474]: time="2025-02-13T16:18:31.797359880Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:18:31.799267 containerd[1474]: time="2025-02-13T16:18:31.797377221Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:18:31.797943 systemd[1]: run-netns-cni\x2d86cde09a\x2d0fb3\x2de02d\x2dd968\x2d8dc4bcf6f1db.mount: Deactivated successfully. Feb 13 16:18:31.798099 systemd[1]: run-netns-cni\x2d3a1c0996\x2dadc4\x2d8842\x2da9aa\x2dfe47d36fcda3.mount: Deactivated successfully. 
Feb 13 16:18:31.804039 containerd[1474]: time="2025-02-13T16:18:31.802688863Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\"" Feb 13 16:18:31.804039 containerd[1474]: time="2025-02-13T16:18:31.802827553Z" level=info msg="TearDown network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" successfully" Feb 13 16:18:31.804039 containerd[1474]: time="2025-02-13T16:18:31.802839868Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" returns successfully" Feb 13 16:18:31.804039 containerd[1474]: time="2025-02-13T16:18:31.803139601Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:18:31.804514 containerd[1474]: time="2025-02-13T16:18:31.804483125Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:18:31.804620 containerd[1474]: time="2025-02-13T16:18:31.804599477Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:18:31.805467 containerd[1474]: time="2025-02-13T16:18:31.805432015Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\"" Feb 13 16:18:31.805676 containerd[1474]: time="2025-02-13T16:18:31.805657072Z" level=info msg="TearDown network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" successfully" Feb 13 16:18:31.805732 containerd[1474]: time="2025-02-13T16:18:31.805721912Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" returns successfully" Feb 13 16:18:31.805886 containerd[1474]: time="2025-02-13T16:18:31.805872022Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:11,}" Feb 13 16:18:31.808013 containerd[1474]: time="2025-02-13T16:18:31.807940837Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\"" Feb 13 16:18:31.808511 containerd[1474]: time="2025-02-13T16:18:31.808484341Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully" Feb 13 16:18:31.808591 containerd[1474]: time="2025-02-13T16:18:31.808579736Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully" Feb 13 16:18:31.810344 containerd[1474]: time="2025-02-13T16:18:31.809846642Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\"" Feb 13 16:18:31.810344 containerd[1474]: time="2025-02-13T16:18:31.810209476Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully" Feb 13 16:18:31.810344 containerd[1474]: time="2025-02-13T16:18:31.810273900Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully" Feb 13 16:18:31.814240 containerd[1474]: time="2025-02-13T16:18:31.813944952Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\"" Feb 13 16:18:31.814240 containerd[1474]: time="2025-02-13T16:18:31.814115835Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully" Feb 13 16:18:31.814240 containerd[1474]: time="2025-02-13T16:18:31.814132369Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully" Feb 13 16:18:31.817293 containerd[1474]: time="2025-02-13T16:18:31.816137422Z" 
level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\"" Feb 13 16:18:31.817293 containerd[1474]: time="2025-02-13T16:18:31.816599660Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully" Feb 13 16:18:31.817293 containerd[1474]: time="2025-02-13T16:18:31.816624358Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully" Feb 13 16:18:31.818161 containerd[1474]: time="2025-02-13T16:18:31.818003276Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\"" Feb 13 16:18:31.818559 containerd[1474]: time="2025-02-13T16:18:31.818528279Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully" Feb 13 16:18:31.818615 containerd[1474]: time="2025-02-13T16:18:31.818556452Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully" Feb 13 16:18:31.819304 containerd[1474]: time="2025-02-13T16:18:31.819274825Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\"" Feb 13 16:18:31.819414 containerd[1474]: time="2025-02-13T16:18:31.819393892Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully" Feb 13 16:18:31.819446 containerd[1474]: time="2025-02-13T16:18:31.819417420Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully" Feb 13 16:18:31.820909 containerd[1474]: time="2025-02-13T16:18:31.820859292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:10,}" Feb 13 16:18:31.978342 sshd[2989]: Invalid 
user cloud-user from 182.16.245.79 port 45436 Feb 13 16:18:32.005093 update_engine[1454]: I20250213 16:18:32.004305 1454 update_attempter.cc:509] Updating boot flags... Feb 13 16:18:32.138673 kubelet[1788]: E0213 16:18:32.138496 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:32.159013 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 37 scanned by (udev-worker) (3186) Feb 13 16:18:32.174338 sshd[2989]: Received disconnect from 182.16.245.79 port 45436:11: Bye Bye [preauth] Feb 13 16:18:32.176268 sshd[2989]: Disconnected from invalid user cloud-user 182.16.245.79 port 45436 [preauth] Feb 13 16:18:32.179850 systemd[1]: sshd@12-64.227.101.255:22-182.16.245.79:45436.service: Deactivated successfully. Feb 13 16:18:32.271581 containerd[1474]: time="2025-02-13T16:18:32.271532620Z" level=info msg="StopContainer for \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\" with timeout 5 (s)" Feb 13 16:18:32.278170 containerd[1474]: time="2025-02-13T16:18:32.278126742Z" level=info msg="Stop container \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\" with signal terminated" Feb 13 16:18:32.577358 systemd-networkd[1377]: cali0d734bb21a0: Link UP Feb 13 16:18:32.577863 systemd-networkd[1377]: cali0d734bb21a0: Gained carrier Feb 13 16:18:32.612183 kubelet[1788]: I0213 16:18:32.608218 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-hpnmv" podStartSLOduration=5.201394742 podStartE2EDuration="30.608145512s" podCreationTimestamp="2025-02-13 16:18:02 +0000 UTC" firstStartedPulling="2025-02-13 16:18:05.729559187 +0000 UTC m=+4.424537055" lastFinishedPulling="2025-02-13 16:18:31.136309956 +0000 UTC m=+29.831287825" observedRunningTime="2025-02-13 16:18:31.833273636 +0000 UTC m=+30.528251530" watchObservedRunningTime="2025-02-13 16:18:32.608145512 +0000 UTC m=+31.303123395" Feb 13 16:18:32.630287 
containerd[1474]: 2025-02-13 16:18:32.068 [INFO][3151] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.118 [INFO][3151] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0 nginx-deployment-6d5f899847- default f6f14251-baf9-4c3b-8963-ab140ec1e4a0 1203 0 2025-02-13 16:18:21 +0000 UTC map[app:nginx pod-template-hash:6d5f899847 projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 64.227.101.255 nginx-deployment-6d5f899847-t8l5b eth0 default [] [] [kns.default ksa.default.default] cali0d734bb21a0 [] []}} ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.118 [INFO][3151] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.380 [INFO][3194] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" HandleID="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Workload="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.404 [INFO][3194] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" 
HandleID="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Workload="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000266280), Attrs:map[string]string{"namespace":"default", "node":"64.227.101.255", "pod":"nginx-deployment-6d5f899847-t8l5b", "timestamp":"2025-02-13 16:18:32.380194773 +0000 UTC"}, Hostname:"64.227.101.255", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.405 [INFO][3194] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.410 [INFO][3194] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.410 [INFO][3194] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '64.227.101.255' Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.420 [INFO][3194] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.449 [INFO][3194] ipam/ipam.go 372: Looking up existing affinities for host host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.464 [INFO][3194] ipam/ipam.go 489: Trying affinity for 192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.471 [INFO][3194] ipam/ipam.go 155: Attempting to load block cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.482 [INFO][3194] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 
16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.482 [INFO][3194] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.119.128/26 handle="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.489 [INFO][3194] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.507 [INFO][3194] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.119.128/26 handle="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.540 [INFO][3194] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.119.129/26] block=192.168.119.128/26 handle="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.543 [INFO][3194] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.119.129/26] handle="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" host="64.227.101.255" Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.543 [INFO][3194] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:18:32.630287 containerd[1474]: 2025-02-13 16:18:32.543 [INFO][3194] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.129/26] IPv6=[] ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" HandleID="k8s-pod-network.b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Workload="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.633513 containerd[1474]: 2025-02-13 16:18:32.553 [INFO][3151] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0", GenerateName:"nginx-deployment-6d5f899847-", Namespace:"default", SelfLink:"", UID:"f6f14251-baf9-4c3b-8963-ab140ec1e4a0", ResourceVersion:"1203", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"6d5f899847", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"", Pod:"nginx-deployment-6d5f899847-t8l5b", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.119.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali0d734bb21a0", 
MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:32.633513 containerd[1474]: 2025-02-13 16:18:32.553 [INFO][3151] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.119.129/32] ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.633513 containerd[1474]: 2025-02-13 16:18:32.553 [INFO][3151] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0d734bb21a0 ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.633513 containerd[1474]: 2025-02-13 16:18:32.582 [INFO][3151] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.633513 containerd[1474]: 2025-02-13 16:18:32.585 [INFO][3151] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0", GenerateName:"nginx-deployment-6d5f899847-", Namespace:"default", SelfLink:"", UID:"f6f14251-baf9-4c3b-8963-ab140ec1e4a0", ResourceVersion:"1203", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 
18, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nginx", "pod-template-hash":"6d5f899847", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c", Pod:"nginx-deployment-6d5f899847-t8l5b", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.119.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali0d734bb21a0", MAC:"06:f3:9f:75:9a:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:32.633513 containerd[1474]: 2025-02-13 16:18:32.619 [INFO][3151] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c" Namespace="default" Pod="nginx-deployment-6d5f899847-t8l5b" WorkloadEndpoint="64.227.101.255-k8s-nginx--deployment--6d5f899847--t8l5b-eth0" Feb 13 16:18:32.717332 containerd[1474]: time="2025-02-13T16:18:32.717010649Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:32.717332 containerd[1474]: time="2025-02-13T16:18:32.717101145Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:32.717332 containerd[1474]: time="2025-02-13T16:18:32.717133769Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:32.720553 containerd[1474]: time="2025-02-13T16:18:32.720339272Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:32.744479 systemd-networkd[1377]: califf91e5b3bae: Link UP Feb 13 16:18:32.751598 systemd-networkd[1377]: califf91e5b3bae: Gained carrier Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.052 [INFO][3113] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.215 [INFO][3113] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0 calico-kube-controllers-69d5998dc- calico-system 65a0808a-15de-4f43-bc29-6bb453f1a0be 1326 0 2025-02-13 16:18:30 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:69d5998dc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s 64.227.101.255 calico-kube-controllers-69d5998dc-cpdsh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] califf91e5b3bae [] []}} ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.215 [INFO][3113] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 
16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.402 [INFO][3230] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" HandleID="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Workload="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.432 [INFO][3230] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" HandleID="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Workload="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318480), Attrs:map[string]string{"namespace":"calico-system", "node":"64.227.101.255", "pod":"calico-kube-controllers-69d5998dc-cpdsh", "timestamp":"2025-02-13 16:18:32.402362558 +0000 UTC"}, Hostname:"64.227.101.255", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.433 [INFO][3230] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.543 [INFO][3230] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.544 [INFO][3230] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '64.227.101.255' Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.550 [INFO][3230] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.582 [INFO][3230] ipam/ipam.go 372: Looking up existing affinities for host host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.623 [INFO][3230] ipam/ipam.go 489: Trying affinity for 192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.635 [INFO][3230] ipam/ipam.go 155: Attempting to load block cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.642 [INFO][3230] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.643 [INFO][3230] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.119.128/26 handle="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.652 [INFO][3230] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847 Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.669 [INFO][3230] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.119.128/26 handle="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.709 [INFO][3230] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.119.130/26] block=192.168.119.128/26 
handle="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.710 [INFO][3230] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.119.130/26] handle="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" host="64.227.101.255" Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.710 [INFO][3230] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:18:32.804384 containerd[1474]: 2025-02-13 16:18:32.710 [INFO][3230] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.130/26] IPv6=[] ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" HandleID="k8s-pod-network.61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Workload="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 16:18:32.808016 containerd[1474]: 2025-02-13 16:18:32.721 [INFO][3113] cni-plugin/k8s.go 386: Populated endpoint ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0", GenerateName:"calico-kube-controllers-69d5998dc-", Namespace:"calico-system", SelfLink:"", UID:"65a0808a-15de-4f43-bc29-6bb453f1a0be", ResourceVersion:"1326", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69d5998dc", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"", Pod:"calico-kube-controllers-69d5998dc-cpdsh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califf91e5b3bae", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:32.808016 containerd[1474]: 2025-02-13 16:18:32.721 [INFO][3113] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.119.130/32] ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 16:18:32.808016 containerd[1474]: 2025-02-13 16:18:32.721 [INFO][3113] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califf91e5b3bae ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 16:18:32.808016 containerd[1474]: 2025-02-13 16:18:32.751 [INFO][3113] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 16:18:32.808016 
containerd[1474]: 2025-02-13 16:18:32.767 [INFO][3113] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0", GenerateName:"calico-kube-controllers-69d5998dc-", Namespace:"calico-system", SelfLink:"", UID:"65a0808a-15de-4f43-bc29-6bb453f1a0be", ResourceVersion:"1326", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"69d5998dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847", Pod:"calico-kube-controllers-69d5998dc-cpdsh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califf91e5b3bae", MAC:"36:dc:f4:25:e4:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:32.808016 containerd[1474]: 
2025-02-13 16:18:32.798 [INFO][3113] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847" Namespace="calico-system" Pod="calico-kube-controllers-69d5998dc-cpdsh" WorkloadEndpoint="64.227.101.255-k8s-calico--kube--controllers--69d5998dc--cpdsh-eth0" Feb 13 16:18:32.839571 systemd[1]: Started cri-containerd-b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c.scope - libcontainer container b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c. Feb 13 16:18:32.903300 systemd-networkd[1377]: cali9c1cf78caa0: Link UP Feb 13 16:18:32.904640 systemd-networkd[1377]: cali9c1cf78caa0: Gained carrier Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:31.969 [INFO][3131] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.075 [INFO][3131] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {64.227.101.255-k8s-csi--node--driver--jlrrr-eth0 csi-node-driver- calico-system 63a56fc8-68aa-4d63-8400-3078bd2ff61f 1104 0 2025-02-13 16:18:02 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b695c467 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s 64.227.101.255 csi-node-driver-jlrrr eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9c1cf78caa0 [] []}} ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.079 [INFO][3131] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.432 [INFO][3206] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" HandleID="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Workload="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.457 [INFO][3206] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" HandleID="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Workload="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00035bc80), Attrs:map[string]string{"namespace":"calico-system", "node":"64.227.101.255", "pod":"csi-node-driver-jlrrr", "timestamp":"2025-02-13 16:18:32.432515487 +0000 UTC"}, Hostname:"64.227.101.255", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.457 [INFO][3206] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.713 [INFO][3206] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.714 [INFO][3206] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '64.227.101.255' Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.724 [INFO][3206] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.766 [INFO][3206] ipam/ipam.go 372: Looking up existing affinities for host host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.796 [INFO][3206] ipam/ipam.go 489: Trying affinity for 192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.818 [INFO][3206] ipam/ipam.go 155: Attempting to load block cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.826 [INFO][3206] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.826 [INFO][3206] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.119.128/26 handle="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.834 [INFO][3206] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.853 [INFO][3206] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.119.128/26 handle="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.873 [INFO][3206] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.119.131/26] block=192.168.119.128/26 
handle="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.873 [INFO][3206] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.119.131/26] handle="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" host="64.227.101.255" Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.873 [INFO][3206] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:18:32.956578 containerd[1474]: 2025-02-13 16:18:32.875 [INFO][3206] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.131/26] IPv6=[] ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" HandleID="k8s-pod-network.b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Workload="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.957928 containerd[1474]: 2025-02-13 16:18:32.885 [INFO][3131] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-csi--node--driver--jlrrr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"63a56fc8-68aa-4d63-8400-3078bd2ff61f", ResourceVersion:"1104", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"", Pod:"csi-node-driver-jlrrr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9c1cf78caa0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:32.957928 containerd[1474]: 2025-02-13 16:18:32.885 [INFO][3131] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.119.131/32] ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.957928 containerd[1474]: 2025-02-13 16:18:32.885 [INFO][3131] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c1cf78caa0 ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.957928 containerd[1474]: 2025-02-13 16:18:32.908 [INFO][3131] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.957928 containerd[1474]: 2025-02-13 16:18:32.910 [INFO][3131] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-csi--node--driver--jlrrr-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"63a56fc8-68aa-4d63-8400-3078bd2ff61f", ResourceVersion:"1104", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de", Pod:"csi-node-driver-jlrrr", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9c1cf78caa0", MAC:"8e:57:d8:9c:68:03", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:32.957928 containerd[1474]: 2025-02-13 16:18:32.941 [INFO][3131] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de" Namespace="calico-system" 
Pod="csi-node-driver-jlrrr" WorkloadEndpoint="64.227.101.255-k8s-csi--node--driver--jlrrr-eth0" Feb 13 16:18:32.974411 containerd[1474]: time="2025-02-13T16:18:32.974108477Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:32.974411 containerd[1474]: time="2025-02-13T16:18:32.974193484Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:32.974411 containerd[1474]: time="2025-02-13T16:18:32.974210190Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:32.974722 containerd[1474]: time="2025-02-13T16:18:32.974411896Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:33.026774 containerd[1474]: time="2025-02-13T16:18:33.026055013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nginx-deployment-6d5f899847-t8l5b,Uid:f6f14251-baf9-4c3b-8963-ab140ec1e4a0,Namespace:default,Attempt:10,} returns sandbox id \"b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c\"" Feb 13 16:18:33.047680 containerd[1474]: time="2025-02-13T16:18:33.047289448Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:33.047680 containerd[1474]: time="2025-02-13T16:18:33.047544360Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:33.047864 containerd[1474]: time="2025-02-13T16:18:33.047636018Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:18:33.052309 containerd[1474]: time="2025-02-13T16:18:33.049083331Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:18:33.053710 systemd[1]: Started cri-containerd-61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847.scope - libcontainer container 61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847.
Feb 13 16:18:33.108632 systemd[1]: Started cri-containerd-b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de.scope - libcontainer container b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de.
Feb 13 16:18:33.142497 kubelet[1788]: E0213 16:18:33.142446 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:33.176579 containerd[1474]: time="2025-02-13T16:18:33.176526109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-69d5998dc-cpdsh,Uid:65a0808a-15de-4f43-bc29-6bb453f1a0be,Namespace:calico-system,Attempt:1,} returns sandbox id \"61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847\""
Feb 13 16:18:33.195178 containerd[1474]: time="2025-02-13T16:18:33.195090833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-jlrrr,Uid:63a56fc8-68aa-4d63-8400-3078bd2ff61f,Namespace:calico-system,Attempt:11,} returns sandbox id \"b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de\""
Feb 13 16:18:33.322550 systemd[1]: cri-containerd-b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5.scope: Deactivated successfully.
Feb 13 16:18:33.323018 systemd[1]: cri-containerd-b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5.scope: Consumed 1.026s CPU time.
Feb 13 16:18:33.509092 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5-rootfs.mount: Deactivated successfully.
Feb 13 16:18:33.523791 containerd[1474]: time="2025-02-13T16:18:33.523681822Z" level=info msg="shim disconnected" id=b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5 namespace=k8s.io
Feb 13 16:18:33.523791 containerd[1474]: time="2025-02-13T16:18:33.523786803Z" level=warning msg="cleaning up after shim disconnected" id=b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5 namespace=k8s.io
Feb 13 16:18:33.523791 containerd[1474]: time="2025-02-13T16:18:33.523801216Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Feb 13 16:18:33.555583 systemd[1]: Started sshd@13-64.227.101.255:22-165.22.176.90:55136.service - OpenSSH per-connection server daemon (165.22.176.90:55136).
Feb 13 16:18:33.626260 containerd[1474]: time="2025-02-13T16:18:33.626159757Z" level=info msg="StopContainer for \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\" returns successfully"
Feb 13 16:18:33.628778 containerd[1474]: time="2025-02-13T16:18:33.628627207Z" level=info msg="StopPodSandbox for \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\""
Feb 13 16:18:33.629952 containerd[1474]: time="2025-02-13T16:18:33.629037375Z" level=info msg="Container to stop \"e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Feb 13 16:18:33.629952 containerd[1474]: time="2025-02-13T16:18:33.629687182Z" level=info msg="Container to stop \"d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Feb 13 16:18:33.629952 containerd[1474]: time="2025-02-13T16:18:33.629872489Z" level=info msg="Container to stop \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Feb 13 16:18:33.639195 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8-shm.mount: Deactivated successfully.
Feb 13 16:18:33.658065 systemd[1]: cri-containerd-396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8.scope: Deactivated successfully.
Feb 13 16:18:33.735424 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8-rootfs.mount: Deactivated successfully.
Feb 13 16:18:33.768209 containerd[1474]: time="2025-02-13T16:18:33.767720920Z" level=info msg="shim disconnected" id=396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8 namespace=k8s.io
Feb 13 16:18:33.768209 containerd[1474]: time="2025-02-13T16:18:33.767820008Z" level=warning msg="cleaning up after shim disconnected" id=396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8 namespace=k8s.io
Feb 13 16:18:33.768209 containerd[1474]: time="2025-02-13T16:18:33.767832385Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Feb 13 16:18:33.812751 containerd[1474]: time="2025-02-13T16:18:33.811898942Z" level=info msg="TearDown network for sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" successfully"
Feb 13 16:18:33.812751 containerd[1474]: time="2025-02-13T16:18:33.811949874Z" level=info msg="StopPodSandbox for \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" returns successfully"
Feb 13 16:18:33.861481 systemd[1]: Started sshd@14-64.227.101.255:22-103.91.136.18:40377.service - OpenSSH per-connection server daemon (103.91.136.18:40377).
Feb 13 16:18:33.925164 kubelet[1788]: I0213 16:18:33.923788 1788 topology_manager.go:215] "Topology Admit Handler" podUID="d7c96831-edfa-4b33-85af-e6fa72d95be9" podNamespace="calico-system" podName="calico-node-ck2p2"
Feb 13 16:18:33.925164 kubelet[1788]: E0213 16:18:33.923886 1788 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ce202485-6d0f-47ce-8917-54c286d3eb4b" containerName="calico-node"
Feb 13 16:18:33.925164 kubelet[1788]: E0213 16:18:33.923908 1788 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ce202485-6d0f-47ce-8917-54c286d3eb4b" containerName="flexvol-driver"
Feb 13 16:18:33.925164 kubelet[1788]: E0213 16:18:33.923920 1788 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ce202485-6d0f-47ce-8917-54c286d3eb4b" containerName="install-cni"
Feb 13 16:18:33.925164 kubelet[1788]: I0213 16:18:33.923955 1788 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce202485-6d0f-47ce-8917-54c286d3eb4b" containerName="calico-node"
Feb 13 16:18:33.937046 systemd-networkd[1377]: cali0d734bb21a0: Gained IPv6LL
Feb 13 16:18:33.939524 systemd[1]: Created slice kubepods-besteffort-podd7c96831_edfa_4b33_85af_e6fa72d95be9.slice - libcontainer container kubepods-besteffort-podd7c96831_edfa_4b33_85af_e6fa72d95be9.slice.
Feb 13 16:18:33.957255 kubelet[1788]: I0213 16:18:33.954450 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-bin-dir\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957255 kubelet[1788]: I0213 16:18:33.954544 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-log-dir\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957255 kubelet[1788]: I0213 16:18:33.954589 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-run-calico\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957255 kubelet[1788]: I0213 16:18:33.954636 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-xtables-lock\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957255 kubelet[1788]: I0213 16:18:33.954676 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ce202485-6d0f-47ce-8917-54c286d3eb4b-node-certs\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957255 kubelet[1788]: I0213 16:18:33.954724 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-lib-calico\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957898 kubelet[1788]: I0213 16:18:33.954768 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jglp2\" (UniqueName: \"kubernetes.io/projected/ce202485-6d0f-47ce-8917-54c286d3eb4b-kube-api-access-jglp2\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957898 kubelet[1788]: I0213 16:18:33.954805 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-flexvol-driver-host\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957898 kubelet[1788]: I0213 16:18:33.954850 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce202485-6d0f-47ce-8917-54c286d3eb4b-tigera-ca-bundle\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957898 kubelet[1788]: I0213 16:18:33.954905 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-net-dir\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957898 kubelet[1788]: I0213 16:18:33.954940 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-policysync\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.957898 kubelet[1788]: I0213 16:18:33.955167 1788 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-lib-modules\") pod \"ce202485-6d0f-47ce-8917-54c286d3eb4b\" (UID: \"ce202485-6d0f-47ce-8917-54c286d3eb4b\") "
Feb 13 16:18:33.958163 kubelet[1788]: I0213 16:18:33.955800 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.958163 kubelet[1788]: I0213 16:18:33.955984 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.958163 kubelet[1788]: I0213 16:18:33.956027 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.958163 kubelet[1788]: I0213 16:18:33.956097 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.958163 kubelet[1788]: I0213 16:18:33.956165 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.980132 kubelet[1788]: I0213 16:18:33.980062 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.980136 systemd[1]: var-lib-kubelet-pods-ce202485\x2d6d0f\x2d47ce\x2d8917\x2d54c286d3eb4b-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully.
Feb 13 16:18:33.982467 kubelet[1788]: I0213 16:18:33.982413 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.984435 kubelet[1788]: I0213 16:18:33.983791 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-policysync" (OuterVolumeSpecName: "policysync") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.984435 kubelet[1788]: I0213 16:18:33.983969 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 13 16:18:33.985020 kubelet[1788]: I0213 16:18:33.984818 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce202485-6d0f-47ce-8917-54c286d3eb4b-node-certs" (OuterVolumeSpecName: "node-certs") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 13 16:18:33.996011 systemd[1]: var-lib-kubelet-pods-ce202485\x2d6d0f\x2d47ce\x2d8917\x2d54c286d3eb4b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2djglp2.mount: Deactivated successfully.
Feb 13 16:18:33.997381 kubelet[1788]: I0213 16:18:33.997321 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce202485-6d0f-47ce-8917-54c286d3eb4b-kube-api-access-jglp2" (OuterVolumeSpecName: "kube-api-access-jglp2") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "kube-api-access-jglp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 13 16:18:34.003641 kubelet[1788]: I0213 16:18:34.003154 1788 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce202485-6d0f-47ce-8917-54c286d3eb4b-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "ce202485-6d0f-47ce-8917-54c286d3eb4b" (UID: "ce202485-6d0f-47ce-8917-54c286d3eb4b"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 13 16:18:34.058298 kubelet[1788]: I0213 16:18:34.056484 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-flexvol-driver-host\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058298 kubelet[1788]: I0213 16:18:34.056568 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-var-run-calico\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058298 kubelet[1788]: I0213 16:18:34.056604 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-xtables-lock\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058298 kubelet[1788]: I0213 16:18:34.056638 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7c96831-edfa-4b33-85af-e6fa72d95be9-tigera-ca-bundle\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058298 kubelet[1788]: I0213 16:18:34.056668 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-cni-log-dir\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058644 kubelet[1788]: I0213 16:18:34.056703 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-cni-net-dir\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058644 kubelet[1788]: I0213 16:18:34.056733 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-lib-modules\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058644 kubelet[1788]: I0213 16:18:34.056763 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-policysync\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058644 kubelet[1788]: I0213 16:18:34.056797 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-var-lib-calico\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058644 kubelet[1788]: I0213 16:18:34.056836 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjs56\" (UniqueName: \"kubernetes.io/projected/d7c96831-edfa-4b33-85af-e6fa72d95be9-kube-api-access-cjs56\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.056874 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d7c96831-edfa-4b33-85af-e6fa72d95be9-node-certs\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.056911 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d7c96831-edfa-4b33-85af-e6fa72d95be9-cni-bin-dir\") pod \"calico-node-ck2p2\" (UID: \"d7c96831-edfa-4b33-85af-e6fa72d95be9\") " pod="calico-system/calico-node-ck2p2"
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.058317 1788 reconciler_common.go:300] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-lib-calico\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.058358 1788 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-jglp2\" (UniqueName: \"kubernetes.io/projected/ce202485-6d0f-47ce-8917-54c286d3eb4b-kube-api-access-jglp2\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.058376 1788 reconciler_common.go:300] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-flexvol-driver-host\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.058392 1788 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce202485-6d0f-47ce-8917-54c286d3eb4b-tigera-ca-bundle\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.058868 kubelet[1788]: I0213 16:18:34.058410 1788 reconciler_common.go:300] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-net-dir\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058427 1788 reconciler_common.go:300] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-policysync\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058442 1788 reconciler_common.go:300] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-lib-modules\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058457 1788 reconciler_common.go:300] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-xtables-lock\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058473 1788 reconciler_common.go:300] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-bin-dir\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058491 1788 reconciler_common.go:300] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-cni-log-dir\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058508 1788 reconciler_common.go:300] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ce202485-6d0f-47ce-8917-54c286d3eb4b-var-run-calico\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.059191 kubelet[1788]: I0213 16:18:34.058569 1788 reconciler_common.go:300] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ce202485-6d0f-47ce-8917-54c286d3eb4b-node-certs\") on node \"64.227.101.255\" DevicePath \"\""
Feb 13 16:18:34.140531 sshd[3504]: Invalid user work from 165.22.176.90 port 55136
Feb 13 16:18:34.144077 kubelet[1788]: E0213 16:18:34.143642 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:34.226974 sshd[3504]: Received disconnect from 165.22.176.90 port 55136:11: Bye Bye [preauth]
Feb 13 16:18:34.226974 sshd[3504]: Disconnected from invalid user work 165.22.176.90 port 55136 [preauth]
Feb 13 16:18:34.231115 systemd[1]: sshd@13-64.227.101.255:22-165.22.176.90:55136.service: Deactivated successfully.
Feb 13 16:18:34.269184 kubelet[1788]: E0213 16:18:34.268777 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:34.271252 containerd[1474]: time="2025-02-13T16:18:34.271162389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ck2p2,Uid:d7c96831-edfa-4b33-85af-e6fa72d95be9,Namespace:calico-system,Attempt:0,}"
Feb 13 16:18:34.317801 containerd[1474]: time="2025-02-13T16:18:34.317119768Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:18:34.317946 containerd[1474]: time="2025-02-13T16:18:34.317400403Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:18:34.317946 containerd[1474]: time="2025-02-13T16:18:34.317434329Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:18:34.317946 containerd[1474]: time="2025-02-13T16:18:34.317594794Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:18:34.356186 systemd[1]: Started cri-containerd-3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd.scope - libcontainer container 3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd.
Feb 13 16:18:34.373913 systemd[1]: Removed slice kubepods-besteffort-podce202485_6d0f_47ce_8917_54c286d3eb4b.slice - libcontainer container kubepods-besteffort-podce202485_6d0f_47ce_8917_54c286d3eb4b.slice.
Feb 13 16:18:34.374467 systemd[1]: kubepods-besteffort-podce202485_6d0f_47ce_8917_54c286d3eb4b.slice: Consumed 1.972s CPU time.
Feb 13 16:18:34.382998 systemd-networkd[1377]: califf91e5b3bae: Gained IPv6LL
Feb 13 16:18:34.445133 containerd[1474]: time="2025-02-13T16:18:34.445000494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ck2p2,Uid:d7c96831-edfa-4b33-85af-e6fa72d95be9,Namespace:calico-system,Attempt:0,} returns sandbox id \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\""
Feb 13 16:18:34.447584 kubelet[1788]: E0213 16:18:34.447321 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:34.452841 containerd[1474]: time="2025-02-13T16:18:34.452607762Z" level=info msg="CreateContainer within sandbox \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Feb 13 16:18:34.488311 containerd[1474]: time="2025-02-13T16:18:34.488030450Z" level=info msg="CreateContainer within sandbox \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3\""
Feb 13 16:18:34.491260 containerd[1474]: time="2025-02-13T16:18:34.490694035Z" level=info msg="StartContainer for \"3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3\""
Feb 13 16:18:34.517161 systemd[1]: var-lib-kubelet-pods-ce202485\x2d6d0f\x2d47ce\x2d8917\x2d54c286d3eb4b-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully.
Feb 13 16:18:34.536699 systemd[1]: Started sshd@15-64.227.101.255:22-186.10.86.130:33304.service - OpenSSH per-connection server daemon (186.10.86.130:33304).
Feb 13 16:18:34.599201 systemd[1]: run-containerd-runc-k8s.io-3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3-runc.bH7AnY.mount: Deactivated successfully.
Feb 13 16:18:34.625843 systemd[1]: Started cri-containerd-3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3.scope - libcontainer container 3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3.
Feb 13 16:18:34.629133 systemd[1]: Started sshd@16-64.227.101.255:22-68.154.41.253:52164.service - OpenSSH per-connection server daemon (68.154.41.253:52164).
Feb 13 16:18:34.724774 containerd[1474]: time="2025-02-13T16:18:34.723839534Z" level=info msg="StartContainer for \"3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3\" returns successfully"
Feb 13 16:18:34.765638 systemd[1]: cri-containerd-3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3.scope: Deactivated successfully.
Feb 13 16:18:34.826935 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3-rootfs.mount: Deactivated successfully.
Feb 13 16:18:34.958011 kubelet[1788]: I0213 16:18:34.888045 1788 scope.go:117] "RemoveContainer" containerID="b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5"
Feb 13 16:18:34.958011 kubelet[1788]: E0213 16:18:34.897014 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:34.898419 systemd-networkd[1377]: cali9c1cf78caa0: Gained IPv6LL
Feb 13 16:18:34.966156 containerd[1474]: time="2025-02-13T16:18:34.964266582Z" level=info msg="RemoveContainer for \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\""
Feb 13 16:18:34.977442 containerd[1474]: time="2025-02-13T16:18:34.976537852Z" level=info msg="RemoveContainer for \"b12e6134ac9aa5186baeb26f958bbc1438b8ad8bf2561c446fcf2047e552bbc5\" returns successfully"
Feb 13 16:18:34.977442 containerd[1474]: time="2025-02-13T16:18:34.976922646Z" level=info msg="shim disconnected" id=3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3 namespace=k8s.io
Feb 13 16:18:34.977442 containerd[1474]: time="2025-02-13T16:18:34.976975747Z" level=warning msg="cleaning up after shim disconnected" id=3600921f15ff1c17482cdd6e1846b2c8ac5133f70e1430f4378ed6300f1a74b3 namespace=k8s.io
Feb 13 16:18:34.977442 containerd[1474]: time="2025-02-13T16:18:34.976999218Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Feb 13 16:18:34.978954 kubelet[1788]: I0213 16:18:34.978535 1788 scope.go:117] "RemoveContainer" containerID="d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc"
Feb 13 16:18:34.988016 containerd[1474]: time="2025-02-13T16:18:34.987508112Z" level=info msg="RemoveContainer for \"d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc\""
Feb 13 16:18:35.007988 containerd[1474]: time="2025-02-13T16:18:35.007904873Z" level=info msg="RemoveContainer for \"d7f508028c3aedb6aefda8e4eeeec0b63dab28595df558bba9958a059fb372dc\" returns successfully"
Feb 13 16:18:35.008683 kubelet[1788]: I0213 16:18:35.008460 1788 scope.go:117] "RemoveContainer" containerID="e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6"
Feb 13 16:18:35.013055 containerd[1474]: time="2025-02-13T16:18:35.012202352Z" level=info msg="RemoveContainer for \"e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6\""
Feb 13 16:18:35.031244 containerd[1474]: time="2025-02-13T16:18:35.031154792Z" level=info msg="RemoveContainer for \"e52cc9c8b7426ffa28d3fd68e10b8187d5c28aa68352d0cc1a529cc9f5371ba6\" returns successfully"
Feb 13 16:18:35.094048 sshd[3617]: Invalid user matheus from 68.154.41.253 port 52164
Feb 13 16:18:35.145138 kubelet[1788]: E0213 16:18:35.145062 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:35.162873 sshd[3617]: Received disconnect from 68.154.41.253 port 52164:11: Bye Bye [preauth]
Feb 13 16:18:35.162873 sshd[3617]: Disconnected from invalid user matheus 68.154.41.253 port 52164 [preauth]
Feb 13 16:18:35.166300 systemd[1]: sshd@16-64.227.101.255:22-68.154.41.253:52164.service: Deactivated successfully.
Feb 13 16:18:35.204344 containerd[1474]: time="2025-02-13T16:18:35.204179314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:18:35.205659 containerd[1474]: time="2025-02-13T16:18:35.205583087Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=29850141"
Feb 13 16:18:35.205659 containerd[1474]: time="2025-02-13T16:18:35.205942221Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:18:35.209658 containerd[1474]: time="2025-02-13T16:18:35.209500043Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:18:35.211316 containerd[1474]: time="2025-02-13T16:18:35.211214765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 4.074229059s"
Feb 13 16:18:35.211498 containerd[1474]: time="2025-02-13T16:18:35.211478128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Feb 13 16:18:35.213678 containerd[1474]: time="2025-02-13T16:18:35.212506473Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\""
Feb 13 16:18:35.225314 containerd[1474]: time="2025-02-13T16:18:35.225140818Z" level=info msg="CreateContainer within sandbox \"7c02d8043a415d78039ea4042a791c4a12ca3746e8b16d2d499b855c8de93d8a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Feb 13 16:18:35.240192 sshd[3544]: Invalid user alex from 103.91.136.18 port 40377
Feb 13 16:18:35.266290 containerd[1474]: time="2025-02-13T16:18:35.265349760Z" level=info msg="CreateContainer within sandbox \"7c02d8043a415d78039ea4042a791c4a12ca3746e8b16d2d499b855c8de93d8a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e500daa4473a21843bea88fe87e154d177a3e33409e14bfafdb53d63d5ab4c89\""
Feb 13 16:18:35.266748 containerd[1474]: time="2025-02-13T16:18:35.266677638Z" level=info msg="StartContainer for \"e500daa4473a21843bea88fe87e154d177a3e33409e14bfafdb53d63d5ab4c89\""
Feb 13 16:18:35.319852 systemd[1]: Started cri-containerd-e500daa4473a21843bea88fe87e154d177a3e33409e14bfafdb53d63d5ab4c89.scope - libcontainer container e500daa4473a21843bea88fe87e154d177a3e33409e14bfafdb53d63d5ab4c89.
Feb 13 16:18:35.389685 containerd[1474]: time="2025-02-13T16:18:35.389588978Z" level=info msg="StartContainer for \"e500daa4473a21843bea88fe87e154d177a3e33409e14bfafdb53d63d5ab4c89\" returns successfully"
Feb 13 16:18:35.512450 sshd[3544]: Received disconnect from 103.91.136.18 port 40377:11: Bye Bye [preauth]
Feb 13 16:18:35.512450 sshd[3544]: Disconnected from invalid user alex 103.91.136.18 port 40377 [preauth]
Feb 13 16:18:35.520947 systemd[1]: sshd@14-64.227.101.255:22-103.91.136.18:40377.service: Deactivated successfully.
Feb 13 16:18:35.589883 sshd[3604]: Invalid user odoo from 186.10.86.130 port 33304
Feb 13 16:18:35.786788 sshd[3604]: Received disconnect from 186.10.86.130 port 33304:11: Bye Bye [preauth]
Feb 13 16:18:35.788302 sshd[3604]: Disconnected from invalid user odoo 186.10.86.130 port 33304 [preauth]
Feb 13 16:18:35.789143 systemd[1]: sshd@15-64.227.101.255:22-186.10.86.130:33304.service: Deactivated successfully.
Feb 13 16:18:35.908595 kubelet[1788]: E0213 16:18:35.908370 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:35.922321 kubelet[1788]: E0213 16:18:35.921025 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:18:35.925395 containerd[1474]: time="2025-02-13T16:18:35.925349095Z" level=info msg="CreateContainer within sandbox \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Feb 13 16:18:35.983455 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3547411522.mount: Deactivated successfully.
Feb 13 16:18:35.987993 kubelet[1788]: I0213 16:18:35.987930 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-5856dc5d7f-4qnh9" podStartSLOduration=2.241755316 podStartE2EDuration="8.987878074s" podCreationTimestamp="2025-02-13 16:18:27 +0000 UTC" firstStartedPulling="2025-02-13 16:18:28.466005804 +0000 UTC m=+27.160983686" lastFinishedPulling="2025-02-13 16:18:35.212128552 +0000 UTC m=+33.907106444" observedRunningTime="2025-02-13 16:18:35.948084777 +0000 UTC m=+34.643082198" watchObservedRunningTime="2025-02-13 16:18:35.987878074 +0000 UTC m=+34.682855948"
Feb 13 16:18:35.993667 containerd[1474]: time="2025-02-13T16:18:35.991482034Z" level=info msg="CreateContainer within sandbox \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657\""
Feb 13 16:18:35.994877 containerd[1474]: time="2025-02-13T16:18:35.994814839Z" level=info msg="StartContainer for \"0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657\""
Feb 13 16:18:36.073888 systemd[1]: Started cri-containerd-0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657.scope - libcontainer container 0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657.
Feb 13 16:18:36.146174 kubelet[1788]: E0213 16:18:36.145602 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:18:36.169277 containerd[1474]: time="2025-02-13T16:18:36.169155911Z" level=info msg="StartContainer for \"0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657\" returns successfully"
Feb 13 16:18:36.277854 systemd[1]: Started sshd@17-64.227.101.255:22-94.182.88.214:54036.service - OpenSSH per-connection server daemon (94.182.88.214:54036).
Feb 13 16:18:36.361070 kubelet[1788]: I0213 16:18:36.359750 1788 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="ce202485-6d0f-47ce-8917-54c286d3eb4b" path="/var/lib/kubelet/pods/ce202485-6d0f-47ce-8917-54c286d3eb4b/volumes"
Feb 13 16:18:36.513860 systemd[1]: run-containerd-runc-k8s.io-0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657-runc.eY13Ev.mount: Deactivated successfully.
Feb 13 16:18:36.930362 kubelet[1788]: I0213 16:18:36.930307 1788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:18:36.932804 kubelet[1788]: E0213 16:18:36.932763 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:36.933840 kubelet[1788]: E0213 16:18:36.933690 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:37.146090 kubelet[1788]: E0213 16:18:37.146000 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:37.388065 systemd[1]: cri-containerd-0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657.scope: Deactivated successfully. Feb 13 16:18:37.469624 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657-rootfs.mount: Deactivated successfully. 
Feb 13 16:18:37.518574 sshd[3745]: Invalid user haichao from 94.182.88.214 port 54036 Feb 13 16:18:37.520703 containerd[1474]: time="2025-02-13T16:18:37.520073908Z" level=info msg="shim disconnected" id=0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657 namespace=k8s.io Feb 13 16:18:37.520703 containerd[1474]: time="2025-02-13T16:18:37.520172176Z" level=warning msg="cleaning up after shim disconnected" id=0fef97a8c3d5289b9a8a82f30b57d63e874e9aff011c162c66290a2644749657 namespace=k8s.io Feb 13 16:18:37.520703 containerd[1474]: time="2025-02-13T16:18:37.520185339Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 16:18:37.748464 sshd[3745]: Received disconnect from 94.182.88.214 port 54036:11: Bye Bye [preauth] Feb 13 16:18:37.748464 sshd[3745]: Disconnected from invalid user haichao 94.182.88.214 port 54036 [preauth] Feb 13 16:18:37.751472 systemd[1]: sshd@17-64.227.101.255:22-94.182.88.214:54036.service: Deactivated successfully. Feb 13 16:18:37.942397 kubelet[1788]: E0213 16:18:37.942354 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:37.979356 containerd[1474]: time="2025-02-13T16:18:37.978549429Z" level=info msg="CreateContainer within sandbox \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 16:18:38.015403 containerd[1474]: time="2025-02-13T16:18:38.015168596Z" level=info msg="CreateContainer within sandbox \"3d043d51faa2890ff5b9c5fea6ccdeef8fc7a08a7490d5da360c9adfc77656dd\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4d53c0c36f456a0688db1d1e608fd6f89a5856e7d7bd716c741001b91f41030c\"" Feb 13 16:18:38.016784 containerd[1474]: time="2025-02-13T16:18:38.016735459Z" level=info msg="StartContainer for 
\"4d53c0c36f456a0688db1d1e608fd6f89a5856e7d7bd716c741001b91f41030c\"" Feb 13 16:18:38.093592 systemd[1]: Started cri-containerd-4d53c0c36f456a0688db1d1e608fd6f89a5856e7d7bd716c741001b91f41030c.scope - libcontainer container 4d53c0c36f456a0688db1d1e608fd6f89a5856e7d7bd716c741001b91f41030c. Feb 13 16:18:38.147934 kubelet[1788]: E0213 16:18:38.147089 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:38.180199 containerd[1474]: time="2025-02-13T16:18:38.180096233Z" level=info msg="StartContainer for \"4d53c0c36f456a0688db1d1e608fd6f89a5856e7d7bd716c741001b91f41030c\" returns successfully" Feb 13 16:18:38.701263 systemd[1]: Started sshd@18-64.227.101.255:22-42.200.66.164:44936.service - OpenSSH per-connection server daemon (42.200.66.164:44936). Feb 13 16:18:38.949790 kubelet[1788]: E0213 16:18:38.949089 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:38.971078 kubelet[1788]: I0213 16:18:38.970927 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-ck2p2" podStartSLOduration=5.970854437 podStartE2EDuration="5.970854437s" podCreationTimestamp="2025-02-13 16:18:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:18:38.970378411 +0000 UTC m=+37.665356297" watchObservedRunningTime="2025-02-13 16:18:38.970854437 +0000 UTC m=+37.665832486" Feb 13 16:18:39.147538 kubelet[1788]: E0213 16:18:39.147486 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:39.636419 sshd[3838]: Invalid user zhangsan from 42.200.66.164 port 44936 Feb 13 16:18:39.806691 sshd[3838]: Received disconnect from 42.200.66.164 port 44936:11: 
Bye Bye [preauth] Feb 13 16:18:39.810301 sshd[3838]: Disconnected from invalid user zhangsan 42.200.66.164 port 44936 [preauth] Feb 13 16:18:39.811858 systemd[1]: sshd@18-64.227.101.255:22-42.200.66.164:44936.service: Deactivated successfully. Feb 13 16:18:39.883658 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1479738475.mount: Deactivated successfully. Feb 13 16:18:39.957771 kubelet[1788]: E0213 16:18:39.956427 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:40.147677 kubelet[1788]: E0213 16:18:40.147617 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:41.148552 kubelet[1788]: E0213 16:18:41.148490 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:42.079361 containerd[1474]: time="2025-02-13T16:18:42.079287945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:42.081599 containerd[1474]: time="2025-02-13T16:18:42.081539339Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=73054493" Feb 13 16:18:42.084403 containerd[1474]: time="2025-02-13T16:18:42.084343294Z" level=info msg="ImageCreate event name:\"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:42.097501 containerd[1474]: time="2025-02-13T16:18:42.097390742Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:42.099428 containerd[1474]: time="2025-02-13T16:18:42.099347935Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 6.886794137s" Feb 13 16:18:42.099428 containerd[1474]: time="2025-02-13T16:18:42.099422112Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\"" Feb 13 16:18:42.105276 containerd[1474]: time="2025-02-13T16:18:42.103558216Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Feb 13 16:18:42.106729 containerd[1474]: time="2025-02-13T16:18:42.106672205Z" level=info msg="CreateContainer within sandbox \"b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c\" for container &ContainerMetadata{Name:nginx,Attempt:0,}" Feb 13 16:18:42.107006 kubelet[1788]: E0213 16:18:42.106956 1788 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:42.131689 containerd[1474]: time="2025-02-13T16:18:42.131515052Z" level=info msg="CreateContainer within sandbox \"b696b60686de5f340e8cf4eb02a5fb491ab60230817de3b951c552cb383e946c\" for &ContainerMetadata{Name:nginx,Attempt:0,} returns container id \"0ff67695c1a4a8d182516f9adc7d541ba5d3c9d479891e851c6065f0afdae397\"" Feb 13 16:18:42.133740 containerd[1474]: time="2025-02-13T16:18:42.132621900Z" level=info msg="StartContainer for \"0ff67695c1a4a8d182516f9adc7d541ba5d3c9d479891e851c6065f0afdae397\"" Feb 13 16:18:42.150291 kubelet[1788]: E0213 16:18:42.149363 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:42.187585 systemd[1]: run-containerd-runc-k8s.io-0ff67695c1a4a8d182516f9adc7d541ba5d3c9d479891e851c6065f0afdae397-runc.8bTKT0.mount: 
Deactivated successfully. Feb 13 16:18:42.200644 systemd[1]: Started cri-containerd-0ff67695c1a4a8d182516f9adc7d541ba5d3c9d479891e851c6065f0afdae397.scope - libcontainer container 0ff67695c1a4a8d182516f9adc7d541ba5d3c9d479891e851c6065f0afdae397. Feb 13 16:18:42.268697 containerd[1474]: time="2025-02-13T16:18:42.268566660Z" level=info msg="StartContainer for \"0ff67695c1a4a8d182516f9adc7d541ba5d3c9d479891e851c6065f0afdae397\" returns successfully" Feb 13 16:18:42.448722 systemd[1]: Started sshd@19-64.227.101.255:22-15.204.59.193:46478.service - OpenSSH per-connection server daemon (15.204.59.193:46478). Feb 13 16:18:42.686813 sshd[4098]: Invalid user clouduser from 15.204.59.193 port 46478 Feb 13 16:18:42.708184 sshd[4098]: Received disconnect from 15.204.59.193 port 46478:11: Bye Bye [preauth] Feb 13 16:18:42.708184 sshd[4098]: Disconnected from invalid user clouduser 15.204.59.193 port 46478 [preauth] Feb 13 16:18:42.710821 systemd[1]: sshd@19-64.227.101.255:22-15.204.59.193:46478.service: Deactivated successfully. Feb 13 16:18:43.153311 kubelet[1788]: E0213 16:18:43.153092 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:43.531415 systemd[1]: Started sshd@20-64.227.101.255:22-175.207.13.86:47948.service - OpenSSH per-connection server daemon (175.207.13.86:47948). Feb 13 16:18:44.153664 kubelet[1788]: E0213 16:18:44.153590 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:44.701165 sshd[4119]: Invalid user alex from 175.207.13.86 port 47948 Feb 13 16:18:44.838070 sshd[4119]: Received disconnect from 175.207.13.86 port 47948:11: Bye Bye [preauth] Feb 13 16:18:44.838070 sshd[4119]: Disconnected from invalid user alex 175.207.13.86 port 47948 [preauth] Feb 13 16:18:44.842509 systemd[1]: sshd@20-64.227.101.255:22-175.207.13.86:47948.service: Deactivated successfully. 
Feb 13 16:18:44.889292 containerd[1474]: time="2025-02-13T16:18:44.888365399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:44.897364 containerd[1474]: time="2025-02-13T16:18:44.892999624Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Feb 13 16:18:44.897364 containerd[1474]: time="2025-02-13T16:18:44.893691082Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:44.926221 containerd[1474]: time="2025-02-13T16:18:44.926122593Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.82248311s" Feb 13 16:18:44.926445 containerd[1474]: time="2025-02-13T16:18:44.926412757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Feb 13 16:18:44.926535 containerd[1474]: time="2025-02-13T16:18:44.926365913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:44.930614 containerd[1474]: time="2025-02-13T16:18:44.930550549Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 16:18:44.959848 containerd[1474]: time="2025-02-13T16:18:44.959649299Z" level=info msg="CreateContainer within sandbox 
\"61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Feb 13 16:18:44.989306 containerd[1474]: time="2025-02-13T16:18:44.989169248Z" level=info msg="CreateContainer within sandbox \"61ec4eac883bded2aff1d103ec1c787c2184fed656a7c3c2cf82598391252847\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"fd79b4c07152af78e550ed084e80f6f40f18af5f3465346f3e29a884ac9bcc1b\"" Feb 13 16:18:44.990890 containerd[1474]: time="2025-02-13T16:18:44.990705389Z" level=info msg="StartContainer for \"fd79b4c07152af78e550ed084e80f6f40f18af5f3465346f3e29a884ac9bcc1b\"" Feb 13 16:18:45.046571 systemd[1]: Started cri-containerd-fd79b4c07152af78e550ed084e80f6f40f18af5f3465346f3e29a884ac9bcc1b.scope - libcontainer container fd79b4c07152af78e550ed084e80f6f40f18af5f3465346f3e29a884ac9bcc1b. Feb 13 16:18:45.135578 containerd[1474]: time="2025-02-13T16:18:45.135382622Z" level=info msg="StartContainer for \"fd79b4c07152af78e550ed084e80f6f40f18af5f3465346f3e29a884ac9bcc1b\" returns successfully" Feb 13 16:18:45.153862 kubelet[1788]: E0213 16:18:45.153788 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:46.154861 kubelet[1788]: E0213 16:18:46.154794 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:46.159161 kubelet[1788]: I0213 16:18:46.159107 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="default/nginx-deployment-6d5f899847-t8l5b" podStartSLOduration=16.089806008 podStartE2EDuration="25.159040787s" podCreationTimestamp="2025-02-13 16:18:21 +0000 UTC" firstStartedPulling="2025-02-13 16:18:33.03330677 +0000 UTC m=+31.728284650" lastFinishedPulling="2025-02-13 16:18:42.102541537 +0000 UTC m=+40.797519429" observedRunningTime="2025-02-13 16:18:42.995015185 +0000 UTC 
m=+41.689993077" watchObservedRunningTime="2025-02-13 16:18:46.159040787 +0000 UTC m=+44.854018682" Feb 13 16:18:46.557308 containerd[1474]: time="2025-02-13T16:18:46.556939299Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:46.559278 containerd[1474]: time="2025-02-13T16:18:46.559060450Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 16:18:46.560453 containerd[1474]: time="2025-02-13T16:18:46.560362454Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:46.563529 containerd[1474]: time="2025-02-13T16:18:46.563434155Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:46.565143 containerd[1474]: time="2025-02-13T16:18:46.564218092Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.633615206s" Feb 13 16:18:46.565143 containerd[1474]: time="2025-02-13T16:18:46.564290571Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 16:18:46.567215 containerd[1474]: time="2025-02-13T16:18:46.567161586Z" level=info msg="CreateContainer within sandbox \"b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 
16:18:46.632911 containerd[1474]: time="2025-02-13T16:18:46.632667115Z" level=info msg="CreateContainer within sandbox \"b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"19cdaa95cdd5b5ccc30c750bd5f54314a3b52aa7058cf94e1a17fe1a8a868ef1\"" Feb 13 16:18:46.633884 containerd[1474]: time="2025-02-13T16:18:46.633814262Z" level=info msg="StartContainer for \"19cdaa95cdd5b5ccc30c750bd5f54314a3b52aa7058cf94e1a17fe1a8a868ef1\"" Feb 13 16:18:46.692633 systemd[1]: Started cri-containerd-19cdaa95cdd5b5ccc30c750bd5f54314a3b52aa7058cf94e1a17fe1a8a868ef1.scope - libcontainer container 19cdaa95cdd5b5ccc30c750bd5f54314a3b52aa7058cf94e1a17fe1a8a868ef1. Feb 13 16:18:46.754576 containerd[1474]: time="2025-02-13T16:18:46.754489502Z" level=info msg="StartContainer for \"19cdaa95cdd5b5ccc30c750bd5f54314a3b52aa7058cf94e1a17fe1a8a868ef1\" returns successfully" Feb 13 16:18:46.757850 containerd[1474]: time="2025-02-13T16:18:46.757802811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 16:18:47.089984 systemd[1]: Started sshd@21-64.227.101.255:22-103.10.44.105:58290.service - OpenSSH per-connection server daemon (103.10.44.105:58290). 
Feb 13 16:18:47.155687 kubelet[1788]: E0213 16:18:47.155382 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:48.106303 sshd[4314]: Invalid user test from 103.10.44.105 port 58290 Feb 13 16:18:48.155679 kubelet[1788]: E0213 16:18:48.155603 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:48.302270 sshd[4314]: Received disconnect from 103.10.44.105 port 58290:11: Bye Bye [preauth] Feb 13 16:18:48.302270 sshd[4314]: Disconnected from invalid user test 103.10.44.105 port 58290 [preauth] Feb 13 16:18:48.300889 systemd[1]: sshd@21-64.227.101.255:22-103.10.44.105:58290.service: Deactivated successfully. Feb 13 16:18:48.615355 kubelet[1788]: I0213 16:18:48.614854 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-69d5998dc-cpdsh" podStartSLOduration=6.867449899 podStartE2EDuration="18.614785747s" podCreationTimestamp="2025-02-13 16:18:30 +0000 UTC" firstStartedPulling="2025-02-13 16:18:33.180432721 +0000 UTC m=+31.875410587" lastFinishedPulling="2025-02-13 16:18:44.927768413 +0000 UTC m=+43.622746435" observedRunningTime="2025-02-13 16:18:46.161034025 +0000 UTC m=+44.856011920" watchObservedRunningTime="2025-02-13 16:18:48.614785747 +0000 UTC m=+47.309763638" Feb 13 16:18:48.617834 kubelet[1788]: I0213 16:18:48.616775 1788 topology_manager.go:215] "Topology Admit Handler" podUID="62f1d67d-dfad-4296-a4ea-74226d9482dc" podNamespace="default" podName="nfs-server-provisioner-0" Feb 13 16:18:48.653378 systemd[1]: Created slice kubepods-besteffort-pod62f1d67d_dfad_4296_a4ea_74226d9482dc.slice - libcontainer container kubepods-besteffort-pod62f1d67d_dfad_4296_a4ea_74226d9482dc.slice. 
Feb 13 16:18:48.699164 containerd[1474]: time="2025-02-13T16:18:48.699066901Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:48.701091 containerd[1474]: time="2025-02-13T16:18:48.700758216Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 16:18:48.701999 containerd[1474]: time="2025-02-13T16:18:48.701938296Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:48.706836 containerd[1474]: time="2025-02-13T16:18:48.706739975Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:48.708906 containerd[1474]: time="2025-02-13T16:18:48.708309531Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 1.949997822s" Feb 13 16:18:48.708906 containerd[1474]: time="2025-02-13T16:18:48.708373285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 16:18:48.711845 containerd[1474]: time="2025-02-13T16:18:48.711769886Z" level=info msg="CreateContainer within sandbox \"b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 16:18:48.718954 kubelet[1788]: I0213 16:18:48.718855 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xrf\" (UniqueName: \"kubernetes.io/projected/62f1d67d-dfad-4296-a4ea-74226d9482dc-kube-api-access-w9xrf\") pod \"nfs-server-provisioner-0\" (UID: \"62f1d67d-dfad-4296-a4ea-74226d9482dc\") " pod="default/nfs-server-provisioner-0" Feb 13 16:18:48.718954 kubelet[1788]: I0213 16:18:48.718955 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/62f1d67d-dfad-4296-a4ea-74226d9482dc-data\") pod \"nfs-server-provisioner-0\" (UID: \"62f1d67d-dfad-4296-a4ea-74226d9482dc\") " pod="default/nfs-server-provisioner-0" Feb 13 16:18:48.744777 containerd[1474]: time="2025-02-13T16:18:48.744719239Z" level=info msg="CreateContainer within sandbox \"b993ff052db9d9e69a2fc4d0016ebfcc6135522bda46d4350fe713bab31205de\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4facb8f00b9e6ae31d49e2f9cf420a3327f4efdc2e72dc2761269ffdf07ba505\"" Feb 13 16:18:48.747496 containerd[1474]: time="2025-02-13T16:18:48.747429729Z" level=info msg="StartContainer for \"4facb8f00b9e6ae31d49e2f9cf420a3327f4efdc2e72dc2761269ffdf07ba505\"" Feb 13 16:18:48.809784 systemd[1]: Started cri-containerd-4facb8f00b9e6ae31d49e2f9cf420a3327f4efdc2e72dc2761269ffdf07ba505.scope - libcontainer container 4facb8f00b9e6ae31d49e2f9cf420a3327f4efdc2e72dc2761269ffdf07ba505. 
Feb 13 16:18:48.877153 containerd[1474]: time="2025-02-13T16:18:48.876917699Z" level=info msg="StartContainer for \"4facb8f00b9e6ae31d49e2f9cf420a3327f4efdc2e72dc2761269ffdf07ba505\" returns successfully" Feb 13 16:18:48.964475 containerd[1474]: time="2025-02-13T16:18:48.963279572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:62f1d67d-dfad-4296-a4ea-74226d9482dc,Namespace:default,Attempt:0,}" Feb 13 16:18:49.088115 kubelet[1788]: I0213 16:18:49.088038 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-jlrrr" podStartSLOduration=31.577309265 podStartE2EDuration="47.087971062s" podCreationTimestamp="2025-02-13 16:18:02 +0000 UTC" firstStartedPulling="2025-02-13 16:18:33.198782538 +0000 UTC m=+31.893760406" lastFinishedPulling="2025-02-13 16:18:48.709444331 +0000 UTC m=+47.404422203" observedRunningTime="2025-02-13 16:18:49.08558296 +0000 UTC m=+47.780560844" watchObservedRunningTime="2025-02-13 16:18:49.087971062 +0000 UTC m=+47.782948950" Feb 13 16:18:49.158019 kubelet[1788]: E0213 16:18:49.156676 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:49.424637 kubelet[1788]: I0213 16:18:49.424474 1788 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 16:18:49.426309 kubelet[1788]: I0213 16:18:49.426267 1788 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 16:18:49.448518 systemd-networkd[1377]: cali60e51b789ff: Link UP Feb 13 16:18:49.449034 systemd-networkd[1377]: cali60e51b789ff: Gained carrier Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.167 [INFO][4405] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:18:49.482556 
containerd[1474]: 2025-02-13 16:18:49.221 [INFO][4405] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {64.227.101.255-k8s-nfs--server--provisioner--0-eth0 nfs-server-provisioner- default 62f1d67d-dfad-4296-a4ea-74226d9482dc 1488 0 2025-02-13 16:18:48 +0000 UTC map[app:nfs-server-provisioner apps.kubernetes.io/pod-index:0 chart:nfs-server-provisioner-1.8.0 controller-revision-hash:nfs-server-provisioner-d5cbb7f57 heritage:Helm projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:nfs-server-provisioner release:nfs-server-provisioner statefulset.kubernetes.io/pod-name:nfs-server-provisioner-0] map[] [] [] []} {k8s 64.227.101.255 nfs-server-provisioner-0 eth0 nfs-server-provisioner [] [] [kns.default ksa.default.nfs-server-provisioner] cali60e51b789ff [{nfs TCP 2049 0 } {nfs-udp UDP 2049 0 } {nlockmgr TCP 32803 0 } {nlockmgr-udp UDP 32803 0 } {mountd TCP 20048 0 } {mountd-udp UDP 20048 0 } {rquotad TCP 875 0 } {rquotad-udp UDP 875 0 } {rpcbind TCP 111 0 } {rpcbind-udp UDP 111 0 } {statd TCP 662 0 } {statd-udp UDP 662 0 }] []}} ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.221 [INFO][4405] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.292 [INFO][4418] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" HandleID="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" 
Workload="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.338 [INFO][4418] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" HandleID="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Workload="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d22e0), Attrs:map[string]string{"namespace":"default", "node":"64.227.101.255", "pod":"nfs-server-provisioner-0", "timestamp":"2025-02-13 16:18:49.292846572 +0000 UTC"}, Hostname:"64.227.101.255", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.338 [INFO][4418] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.339 [INFO][4418] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.339 [INFO][4418] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '64.227.101.255' Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.344 [INFO][4418] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.358 [INFO][4418] ipam/ipam.go 372: Looking up existing affinities for host host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.374 [INFO][4418] ipam/ipam.go 489: Trying affinity for 192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.379 [INFO][4418] ipam/ipam.go 155: Attempting to load block cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.388 [INFO][4418] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.119.128/26 host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.389 [INFO][4418] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.119.128/26 handle="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.394 [INFO][4418] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.410 [INFO][4418] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.119.128/26 handle="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.430 [INFO][4418] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.119.132/26] block=192.168.119.128/26 
handle="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.430 [INFO][4418] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.119.132/26] handle="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" host="64.227.101.255" Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.430 [INFO][4418] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:18:49.482556 containerd[1474]: 2025-02-13 16:18:49.430 [INFO][4418] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.132/26] IPv6=[] ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" HandleID="k8s-pod-network.f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Workload="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.483203 containerd[1474]: 2025-02-13 16:18:49.436 [INFO][4405] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"62f1d67d-dfad-4296-a4ea-74226d9482dc", ResourceVersion:"1488", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.119.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:49.483203 containerd[1474]: 2025-02-13 16:18:49.437 [INFO][4405] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.119.132/32] ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.483203 containerd[1474]: 2025-02-13 16:18:49.437 [INFO][4405] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60e51b789ff ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.483203 containerd[1474]: 2025-02-13 16:18:49.449 [INFO][4405] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.483806 containerd[1474]: 2025-02-13 16:18:49.449 [INFO][4405] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-nfs--server--provisioner--0-eth0", GenerateName:"nfs-server-provisioner-", Namespace:"default", SelfLink:"", UID:"62f1d67d-dfad-4296-a4ea-74226d9482dc", ResourceVersion:"1488", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"nfs-server-provisioner", "apps.kubernetes.io/pod-index":"0", "chart":"nfs-server-provisioner-1.8.0", "controller-revision-hash":"nfs-server-provisioner-d5cbb7f57", "heritage":"Helm", "projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"nfs-server-provisioner", "release":"nfs-server-provisioner", "statefulset.kubernetes.io/pod-name":"nfs-server-provisioner-0"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa", Pod:"nfs-server-provisioner-0", Endpoint:"eth0", ServiceAccountName:"nfs-server-provisioner", IPNetworks:[]string{"192.168.119.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.nfs-server-provisioner"}, InterfaceName:"cali60e51b789ff", MAC:"ae:4b:35:30:24:f3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"nfs", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nfs-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x801, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"nlockmgr-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x8023, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"mountd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x4e50, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rquotad-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x36b, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"rpcbind-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x6f, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x296, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"statd-udp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x296, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:18:49.483806 containerd[1474]: 2025-02-13 16:18:49.473 [INFO][4405] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa" Namespace="default" Pod="nfs-server-provisioner-0" WorkloadEndpoint="64.227.101.255-k8s-nfs--server--provisioner--0-eth0" Feb 13 16:18:49.603312 containerd[1474]: time="2025-02-13T16:18:49.602459432Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:18:49.603312 containerd[1474]: time="2025-02-13T16:18:49.602577097Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:18:49.603312 containerd[1474]: time="2025-02-13T16:18:49.602595234Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:49.603312 containerd[1474]: time="2025-02-13T16:18:49.602932071Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:18:49.635797 systemd[1]: Started cri-containerd-f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa.scope - libcontainer container f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa. Feb 13 16:18:49.730337 containerd[1474]: time="2025-02-13T16:18:49.730224075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:nfs-server-provisioner-0,Uid:62f1d67d-dfad-4296-a4ea-74226d9482dc,Namespace:default,Attempt:0,} returns sandbox id \"f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa\"" Feb 13 16:18:49.738517 containerd[1474]: time="2025-02-13T16:18:49.736182244Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\"" Feb 13 16:18:50.159331 kubelet[1788]: E0213 16:18:50.159126 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:50.767309 systemd-networkd[1377]: cali60e51b789ff: Gained IPv6LL Feb 13 16:18:50.987802 kubelet[1788]: I0213 16:18:50.987747 1788 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:18:51.015972 kubelet[1788]: E0213 16:18:51.015357 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 
67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:51.142278 kubelet[1788]: E0213 16:18:51.142121 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3" Feb 13 16:18:51.159623 kubelet[1788]: E0213 16:18:51.159555 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:52.160837 kubelet[1788]: E0213 16:18:52.160572 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:52.343858 kernel: bpftool[4576]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 16:18:52.799942 systemd-networkd[1377]: vxlan.calico: Link UP Feb 13 16:18:52.799960 systemd-networkd[1377]: vxlan.calico: Gained carrier Feb 13 16:18:53.161901 kubelet[1788]: E0213 16:18:53.161293 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:53.673776 systemd[1]: Started sshd@22-64.227.101.255:22-113.193.234.210:14766.service - OpenSSH per-connection server daemon (113.193.234.210:14766). Feb 13 16:18:53.730119 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount263546813.mount: Deactivated successfully. 
Feb 13 16:18:54.031550 systemd-networkd[1377]: vxlan.calico: Gained IPv6LL Feb 13 16:18:54.162996 kubelet[1788]: E0213 16:18:54.162940 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:55.165628 kubelet[1788]: E0213 16:18:55.165557 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:55.215565 sshd[4652]: Invalid user almacen from 113.193.234.210 port 14766 Feb 13 16:18:55.499786 sshd[4652]: Received disconnect from 113.193.234.210 port 14766:11: Bye Bye [preauth] Feb 13 16:18:55.499786 sshd[4652]: Disconnected from invalid user almacen 113.193.234.210 port 14766 [preauth] Feb 13 16:18:55.503348 systemd[1]: sshd@22-64.227.101.255:22-113.193.234.210:14766.service: Deactivated successfully. Feb 13 16:18:56.166895 kubelet[1788]: E0213 16:18:56.166827 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:57.129311 containerd[1474]: time="2025-02-13T16:18:57.128730949Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:57.130814 containerd[1474]: time="2025-02-13T16:18:57.130738082Z" level=info msg="stop pulling image registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8: active requests=0, bytes read=91039406" Feb 13 16:18:57.133283 containerd[1474]: time="2025-02-13T16:18:57.131940688Z" level=info msg="ImageCreate event name:\"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:57.139528 containerd[1474]: time="2025-02-13T16:18:57.139467358Z" level=info msg="Pulled image \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" with image id \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\", repo tag 
\"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\", repo digest \"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\", size \"91036984\" in 7.403202539s" Feb 13 16:18:57.139934 containerd[1474]: time="2025-02-13T16:18:57.139890461Z" level=info msg="PullImage \"registry.k8s.io/sig-storage/nfs-provisioner:v4.0.8\" returns image reference \"sha256:fd0b16f70b66b72bcb2f91d556fa33eba02729c44ffc5f2c16130e7f9fbed3c4\"" Feb 13 16:18:57.140184 containerd[1474]: time="2025-02-13T16:18:57.139769800Z" level=info msg="ImageCreate event name:\"registry.k8s.io/sig-storage/nfs-provisioner@sha256:c825f3d5e28bde099bd7a3daace28772d412c9157ad47fa752a9ad0baafc118d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:18:57.152299 containerd[1474]: time="2025-02-13T16:18:57.152224013Z" level=info msg="CreateContainer within sandbox \"f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa\" for container &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,}" Feb 13 16:18:57.168001 kubelet[1788]: E0213 16:18:57.167879 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:57.198334 containerd[1474]: time="2025-02-13T16:18:57.197285763Z" level=info msg="CreateContainer within sandbox \"f17fbcf26607ddf1dc40ac277b0f4ddf0f9522f29c10517bfc9f5f07693c64fa\" for &ContainerMetadata{Name:nfs-server-provisioner,Attempt:0,} returns container id \"259725e20c5ff99c6e316c89dc552b824478f5ca9a4dafc96880d3e09bae2a1c\"" Feb 13 16:18:57.202282 containerd[1474]: time="2025-02-13T16:18:57.200588678Z" level=info msg="StartContainer for \"259725e20c5ff99c6e316c89dc552b824478f5ca9a4dafc96880d3e09bae2a1c\"" Feb 13 16:18:57.272854 systemd[1]: Started cri-containerd-259725e20c5ff99c6e316c89dc552b824478f5ca9a4dafc96880d3e09bae2a1c.scope - libcontainer container 259725e20c5ff99c6e316c89dc552b824478f5ca9a4dafc96880d3e09bae2a1c. 
Feb 13 16:18:57.346159 containerd[1474]: time="2025-02-13T16:18:57.345987897Z" level=info msg="StartContainer for \"259725e20c5ff99c6e316c89dc552b824478f5ca9a4dafc96880d3e09bae2a1c\" returns successfully" Feb 13 16:18:58.168505 kubelet[1788]: E0213 16:18:58.168435 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:18:58.174751 kubelet[1788]: I0213 16:18:58.173075 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="default/nfs-server-provisioner-0" podStartSLOduration=2.765691875 podStartE2EDuration="10.172991658s" podCreationTimestamp="2025-02-13 16:18:48 +0000 UTC" firstStartedPulling="2025-02-13 16:18:49.73545382 +0000 UTC m=+48.430431702" lastFinishedPulling="2025-02-13 16:18:57.142753599 +0000 UTC m=+55.837731485" observedRunningTime="2025-02-13 16:18:58.172337146 +0000 UTC m=+56.867315044" watchObservedRunningTime="2025-02-13 16:18:58.172991658 +0000 UTC m=+56.867969543" Feb 13 16:18:58.176576 systemd[1]: run-containerd-runc-k8s.io-259725e20c5ff99c6e316c89dc552b824478f5ca9a4dafc96880d3e09bae2a1c-runc.VrMN6M.mount: Deactivated successfully. 
Feb 13 16:18:59.169730 kubelet[1788]: E0213 16:18:59.169624 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:19:00.170343 kubelet[1788]: E0213 16:19:00.170272 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:19:01.171615 kubelet[1788]: E0213 16:19:01.171535 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:19:02.107778 kubelet[1788]: E0213 16:19:02.107703 1788 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:19:02.188986 kubelet[1788]: E0213 16:19:02.188935 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests" Feb 13 16:19:02.219539 containerd[1474]: time="2025-02-13T16:19:02.219461901Z" level=info msg="StopPodSandbox for \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\"" Feb 13 16:19:02.220353 containerd[1474]: time="2025-02-13T16:19:02.219584686Z" level=info msg="TearDown network for sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" successfully" Feb 13 16:19:02.220353 containerd[1474]: time="2025-02-13T16:19:02.219596528Z" level=info msg="StopPodSandbox for \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" returns successfully" Feb 13 16:19:02.233760 containerd[1474]: time="2025-02-13T16:19:02.230136086Z" level=info msg="RemovePodSandbox for \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\"" Feb 13 16:19:02.241981 containerd[1474]: time="2025-02-13T16:19:02.241871491Z" level=info msg="Forcibly stopping sandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\"" Feb 13 16:19:02.242196 containerd[1474]: time="2025-02-13T16:19:02.242073291Z" level=info msg="TearDown network for sandbox 
\"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" successfully" Feb 13 16:19:02.253329 containerd[1474]: time="2025-02-13T16:19:02.253201068Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:19:02.254211 containerd[1474]: time="2025-02-13T16:19:02.253641227Z" level=info msg="RemovePodSandbox \"396afb5958e8e50662d40172fea03eb7d14c52f872aab9499acdc9c5b1fdbad8\" returns successfully" Feb 13 16:19:02.254875 containerd[1474]: time="2025-02-13T16:19:02.254755562Z" level=info msg="StopPodSandbox for \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\"" Feb 13 16:19:02.255211 containerd[1474]: time="2025-02-13T16:19:02.255119590Z" level=info msg="TearDown network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\" successfully" Feb 13 16:19:02.255211 containerd[1474]: time="2025-02-13T16:19:02.255163260Z" level=info msg="StopPodSandbox for \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\" returns successfully" Feb 13 16:19:02.257303 containerd[1474]: time="2025-02-13T16:19:02.255860468Z" level=info msg="RemovePodSandbox for \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\"" Feb 13 16:19:02.257303 containerd[1474]: time="2025-02-13T16:19:02.255898286Z" level=info msg="Forcibly stopping sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\"" Feb 13 16:19:02.257303 containerd[1474]: time="2025-02-13T16:19:02.256054859Z" level=info msg="TearDown network for sandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\" successfully" Feb 13 16:19:02.264783 containerd[1474]: time="2025-02-13T16:19:02.264594745Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:19:02.264783 containerd[1474]: time="2025-02-13T16:19:02.264740263Z" level=info msg="RemovePodSandbox \"8e65b5147a13f36652d1d5d2384fa5b3449a22733e27c0749d1030a5a148652b\" returns successfully" Feb 13 16:19:02.266109 containerd[1474]: time="2025-02-13T16:19:02.265870441Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:19:02.266109 containerd[1474]: time="2025-02-13T16:19:02.266047538Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:19:02.266109 containerd[1474]: time="2025-02-13T16:19:02.266068650Z" level=info msg="StopPodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:19:02.269730 containerd[1474]: time="2025-02-13T16:19:02.266945302Z" level=info msg="RemovePodSandbox for \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:19:02.269730 containerd[1474]: time="2025-02-13T16:19:02.266987095Z" level=info msg="Forcibly stopping sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\"" Feb 13 16:19:02.269730 containerd[1474]: time="2025-02-13T16:19:02.267108052Z" level=info msg="TearDown network for sandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" successfully" Feb 13 16:19:02.271948 containerd[1474]: time="2025-02-13T16:19:02.271855740Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:19:02.272260 containerd[1474]: time="2025-02-13T16:19:02.272205540Z" level=info msg="RemovePodSandbox \"89f51dbefea9cba4b462bd04d59e03071df02cbd0c96ac5d20d3e8f4b152f46b\" returns successfully" Feb 13 16:19:02.273091 containerd[1474]: time="2025-02-13T16:19:02.273057123Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:19:02.273522 containerd[1474]: time="2025-02-13T16:19:02.273481532Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:19:02.273653 containerd[1474]: time="2025-02-13T16:19:02.273632119Z" level=info msg="StopPodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:19:02.274304 containerd[1474]: time="2025-02-13T16:19:02.274277633Z" level=info msg="RemovePodSandbox for \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:19:02.274444 containerd[1474]: time="2025-02-13T16:19:02.274420297Z" level=info msg="Forcibly stopping sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\"" Feb 13 16:19:02.274667 containerd[1474]: time="2025-02-13T16:19:02.274604466Z" level=info msg="TearDown network for sandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" successfully" Feb 13 16:19:02.298180 containerd[1474]: time="2025-02-13T16:19:02.295883704Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:19:02.298180 containerd[1474]: time="2025-02-13T16:19:02.296004208Z" level=info msg="RemovePodSandbox \"c92ca00c9564c42616b7319a9e42268ceb1df238969d5122f09c1a2e0b31ea23\" returns successfully" Feb 13 16:19:02.313517 containerd[1474]: time="2025-02-13T16:19:02.311269532Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:19:02.313517 containerd[1474]: time="2025-02-13T16:19:02.311469699Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:19:02.313517 containerd[1474]: time="2025-02-13T16:19:02.311489974Z" level=info msg="StopPodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:19:02.328181 containerd[1474]: time="2025-02-13T16:19:02.328109477Z" level=info msg="RemovePodSandbox for \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:19:02.328181 containerd[1474]: time="2025-02-13T16:19:02.328181325Z" level=info msg="Forcibly stopping sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\"" Feb 13 16:19:02.328487 containerd[1474]: time="2025-02-13T16:19:02.328348163Z" level=info msg="TearDown network for sandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" successfully" Feb 13 16:19:02.332357 containerd[1474]: time="2025-02-13T16:19:02.332219315Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:19:02.332701 containerd[1474]: time="2025-02-13T16:19:02.332375547Z" level=info msg="RemovePodSandbox \"b1608fd3d94f289dcc5219479af61f3a4929c44fee3ae969d3447d27764f5624\" returns successfully" Feb 13 16:19:02.334865 containerd[1474]: time="2025-02-13T16:19:02.333480522Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:19:02.334865 containerd[1474]: time="2025-02-13T16:19:02.333871750Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:19:02.334865 containerd[1474]: time="2025-02-13T16:19:02.333900199Z" level=info msg="StopPodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:19:02.342896 containerd[1474]: time="2025-02-13T16:19:02.342484964Z" level=info msg="RemovePodSandbox for \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:19:02.342896 containerd[1474]: time="2025-02-13T16:19:02.342563657Z" level=info msg="Forcibly stopping sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\"" Feb 13 16:19:02.342896 containerd[1474]: time="2025-02-13T16:19:02.342707615Z" level=info msg="TearDown network for sandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" successfully" Feb 13 16:19:02.355657 containerd[1474]: time="2025-02-13T16:19:02.355321245Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:19:02.355657 containerd[1474]: time="2025-02-13T16:19:02.355480812Z" level=info msg="RemovePodSandbox \"e978852a3aec7adca42e6f2493c89dd419c0578104bbfb5341e5bb8a30452d9c\" returns successfully" Feb 13 16:19:02.402759 containerd[1474]: time="2025-02-13T16:19:02.402577492Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:19:02.405107 containerd[1474]: time="2025-02-13T16:19:02.403417942Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:19:02.405107 containerd[1474]: time="2025-02-13T16:19:02.403456739Z" level=info msg="StopPodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully" Feb 13 16:19:02.405107 containerd[1474]: time="2025-02-13T16:19:02.404291002Z" level=info msg="RemovePodSandbox for \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:19:02.405107 containerd[1474]: time="2025-02-13T16:19:02.404331162Z" level=info msg="Forcibly stopping sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\"" Feb 13 16:19:02.405107 containerd[1474]: time="2025-02-13T16:19:02.404742346Z" level=info msg="TearDown network for sandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" successfully" Feb 13 16:19:02.416124 containerd[1474]: time="2025-02-13T16:19:02.415057180Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:19:02.416347 containerd[1474]: time="2025-02-13T16:19:02.416257347Z" level=info msg="RemovePodSandbox \"be7e5214c61d8d13b09340c5e6e86c96351d6f4fa47e201a41a24aef8d058d36\" returns successfully"
Feb 13 16:19:02.417401 containerd[1474]: time="2025-02-13T16:19:02.417353076Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\""
Feb 13 16:19:02.417605 containerd[1474]: time="2025-02-13T16:19:02.417574235Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully"
Feb 13 16:19:02.417650 containerd[1474]: time="2025-02-13T16:19:02.417598976Z" level=info msg="StopPodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully"
Feb 13 16:19:02.419281 containerd[1474]: time="2025-02-13T16:19:02.418339380Z" level=info msg="RemovePodSandbox for \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\""
Feb 13 16:19:02.419281 containerd[1474]: time="2025-02-13T16:19:02.418385848Z" level=info msg="Forcibly stopping sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\""
Feb 13 16:19:02.419281 containerd[1474]: time="2025-02-13T16:19:02.418528340Z" level=info msg="TearDown network for sandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" successfully"
Feb 13 16:19:02.422731 containerd[1474]: time="2025-02-13T16:19:02.422646766Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.423007 containerd[1474]: time="2025-02-13T16:19:02.422760265Z" level=info msg="RemovePodSandbox \"f060d4ffcedb2ff3de0beb367f93559c20bfe37330a58be4be28edac1ea95389\" returns successfully"
Feb 13 16:19:02.423715 containerd[1474]: time="2025-02-13T16:19:02.423420933Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\""
Feb 13 16:19:02.423715 containerd[1474]: time="2025-02-13T16:19:02.423605351Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully"
Feb 13 16:19:02.423715 containerd[1474]: time="2025-02-13T16:19:02.423626006Z" level=info msg="StopPodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully"
Feb 13 16:19:02.425605 containerd[1474]: time="2025-02-13T16:19:02.424400263Z" level=info msg="RemovePodSandbox for \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\""
Feb 13 16:19:02.425605 containerd[1474]: time="2025-02-13T16:19:02.424439914Z" level=info msg="Forcibly stopping sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\""
Feb 13 16:19:02.425605 containerd[1474]: time="2025-02-13T16:19:02.424559255Z" level=info msg="TearDown network for sandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" successfully"
Feb 13 16:19:02.429941 containerd[1474]: time="2025-02-13T16:19:02.429862371Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.430488 containerd[1474]: time="2025-02-13T16:19:02.429975266Z" level=info msg="RemovePodSandbox \"3af24d50b0d0cdd58c79c79fad984f0973a8cc1bc0a5206cd0d71a682737a2c1\" returns successfully"
Feb 13 16:19:02.431015 containerd[1474]: time="2025-02-13T16:19:02.430715061Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\""
Feb 13 16:19:02.431015 containerd[1474]: time="2025-02-13T16:19:02.430907824Z" level=info msg="TearDown network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" successfully"
Feb 13 16:19:02.431015 containerd[1474]: time="2025-02-13T16:19:02.430927818Z" level=info msg="StopPodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" returns successfully"
Feb 13 16:19:02.432131 containerd[1474]: time="2025-02-13T16:19:02.431943019Z" level=info msg="RemovePodSandbox for \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\""
Feb 13 16:19:02.432131 containerd[1474]: time="2025-02-13T16:19:02.432010178Z" level=info msg="Forcibly stopping sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\""
Feb 13 16:19:02.432686 containerd[1474]: time="2025-02-13T16:19:02.432370820Z" level=info msg="TearDown network for sandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" successfully"
Feb 13 16:19:02.436165 containerd[1474]: time="2025-02-13T16:19:02.436075010Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.436564 containerd[1474]: time="2025-02-13T16:19:02.436432481Z" level=info msg="RemovePodSandbox \"2341f6a6732e841f03e77e77c24787de3fb6a9cb6250569c3a5dfd04369c8894\" returns successfully"
Feb 13 16:19:02.437429 containerd[1474]: time="2025-02-13T16:19:02.437391560Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\""
Feb 13 16:19:02.438120 containerd[1474]: time="2025-02-13T16:19:02.437975008Z" level=info msg="TearDown network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" successfully"
Feb 13 16:19:02.438120 containerd[1474]: time="2025-02-13T16:19:02.438043712Z" level=info msg="StopPodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" returns successfully"
Feb 13 16:19:02.440809 containerd[1474]: time="2025-02-13T16:19:02.438920610Z" level=info msg="RemovePodSandbox for \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\""
Feb 13 16:19:02.440809 containerd[1474]: time="2025-02-13T16:19:02.438958443Z" level=info msg="Forcibly stopping sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\""
Feb 13 16:19:02.440809 containerd[1474]: time="2025-02-13T16:19:02.439080472Z" level=info msg="TearDown network for sandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" successfully"
Feb 13 16:19:02.443760 containerd[1474]: time="2025-02-13T16:19:02.443695770Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.444097 containerd[1474]: time="2025-02-13T16:19:02.444066146Z" level=info msg="RemovePodSandbox \"3e1388d5798655a6e06245acef8a8d6616921aa3ad882a4092e0be6666473585\" returns successfully"
Feb 13 16:19:02.444999 containerd[1474]: time="2025-02-13T16:19:02.444966547Z" level=info msg="StopPodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\""
Feb 13 16:19:02.445498 containerd[1474]: time="2025-02-13T16:19:02.445472131Z" level=info msg="TearDown network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" successfully"
Feb 13 16:19:02.445637 containerd[1474]: time="2025-02-13T16:19:02.445618123Z" level=info msg="StopPodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" returns successfully"
Feb 13 16:19:02.446383 containerd[1474]: time="2025-02-13T16:19:02.446345414Z" level=info msg="RemovePodSandbox for \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\""
Feb 13 16:19:02.446508 containerd[1474]: time="2025-02-13T16:19:02.446490493Z" level=info msg="Forcibly stopping sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\""
Feb 13 16:19:02.446788 containerd[1474]: time="2025-02-13T16:19:02.446725105Z" level=info msg="TearDown network for sandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" successfully"
Feb 13 16:19:02.456925 containerd[1474]: time="2025-02-13T16:19:02.456843184Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.457329 containerd[1474]: time="2025-02-13T16:19:02.457300399Z" level=info msg="RemovePodSandbox \"2708b876a55cdaf066511f7a5ccab72f8263264acd632ad5f2c4d07396d58d7f\" returns successfully"
Feb 13 16:19:02.458207 containerd[1474]: time="2025-02-13T16:19:02.458174548Z" level=info msg="StopPodSandbox for \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\""
Feb 13 16:19:02.461635 containerd[1474]: time="2025-02-13T16:19:02.461576902Z" level=info msg="TearDown network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\" successfully"
Feb 13 16:19:02.461989 containerd[1474]: time="2025-02-13T16:19:02.461966456Z" level=info msg="StopPodSandbox for \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\" returns successfully"
Feb 13 16:19:02.462745 containerd[1474]: time="2025-02-13T16:19:02.462719584Z" level=info msg="RemovePodSandbox for \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\""
Feb 13 16:19:02.463136 containerd[1474]: time="2025-02-13T16:19:02.463113279Z" level=info msg="Forcibly stopping sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\""
Feb 13 16:19:02.463652 containerd[1474]: time="2025-02-13T16:19:02.463521795Z" level=info msg="TearDown network for sandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\" successfully"
Feb 13 16:19:02.467117 containerd[1474]: time="2025-02-13T16:19:02.466877184Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.467117 containerd[1474]: time="2025-02-13T16:19:02.466978713Z" level=info msg="RemovePodSandbox \"534c0fbef3f021483028a6ddff1dca61ee31b16e3f415d5f928adb570c651f84\" returns successfully"
Feb 13 16:19:02.468200 containerd[1474]: time="2025-02-13T16:19:02.467929697Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\""
Feb 13 16:19:02.468200 containerd[1474]: time="2025-02-13T16:19:02.468077672Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully"
Feb 13 16:19:02.468200 containerd[1474]: time="2025-02-13T16:19:02.468095354Z" level=info msg="StopPodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully"
Feb 13 16:19:02.471096 containerd[1474]: time="2025-02-13T16:19:02.468846700Z" level=info msg="RemovePodSandbox for \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\""
Feb 13 16:19:02.477868 containerd[1474]: time="2025-02-13T16:19:02.477786863Z" level=info msg="Forcibly stopping sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\""
Feb 13 16:19:02.478522 containerd[1474]: time="2025-02-13T16:19:02.478444310Z" level=info msg="TearDown network for sandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" successfully"
Feb 13 16:19:02.491618 containerd[1474]: time="2025-02-13T16:19:02.491543104Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.494473 containerd[1474]: time="2025-02-13T16:19:02.491658338Z" level=info msg="RemovePodSandbox \"3461574141d4e11dcd8033a4087e60f7d6037790256cdfd5f0e13a67ba7006f5\" returns successfully"
Feb 13 16:19:02.494473 containerd[1474]: time="2025-02-13T16:19:02.492500463Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\""
Feb 13 16:19:02.494473 containerd[1474]: time="2025-02-13T16:19:02.492917564Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully"
Feb 13 16:19:02.494473 containerd[1474]: time="2025-02-13T16:19:02.492942895Z" level=info msg="StopPodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully"
Feb 13 16:19:02.494839 containerd[1474]: time="2025-02-13T16:19:02.494493800Z" level=info msg="RemovePodSandbox for \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\""
Feb 13 16:19:02.494839 containerd[1474]: time="2025-02-13T16:19:02.494534960Z" level=info msg="Forcibly stopping sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\""
Feb 13 16:19:02.494839 containerd[1474]: time="2025-02-13T16:19:02.494690407Z" level=info msg="TearDown network for sandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" successfully"
Feb 13 16:19:02.504288 containerd[1474]: time="2025-02-13T16:19:02.504193861Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.504636 containerd[1474]: time="2025-02-13T16:19:02.504320693Z" level=info msg="RemovePodSandbox \"588f4c8a96fd105f4c2903a02754b19dd35c8e1692593729a01328200aa3cfdf\" returns successfully"
Feb 13 16:19:02.506382 containerd[1474]: time="2025-02-13T16:19:02.505511536Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\""
Feb 13 16:19:02.506382 containerd[1474]: time="2025-02-13T16:19:02.505712304Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully"
Feb 13 16:19:02.506382 containerd[1474]: time="2025-02-13T16:19:02.505736039Z" level=info msg="StopPodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully"
Feb 13 16:19:02.507270 containerd[1474]: time="2025-02-13T16:19:02.507185067Z" level=info msg="RemovePodSandbox for \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\""
Feb 13 16:19:02.507488 containerd[1474]: time="2025-02-13T16:19:02.507340100Z" level=info msg="Forcibly stopping sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\""
Feb 13 16:19:02.509271 containerd[1474]: time="2025-02-13T16:19:02.507742254Z" level=info msg="TearDown network for sandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" successfully"
Feb 13 16:19:02.515628 containerd[1474]: time="2025-02-13T16:19:02.515209001Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.516596 containerd[1474]: time="2025-02-13T16:19:02.515405143Z" level=info msg="RemovePodSandbox \"b1d6d615ae342d902dfe582f2ed685b8390b7ec05ebdf1990beef011257315d6\" returns successfully"
Feb 13 16:19:02.517619 containerd[1474]: time="2025-02-13T16:19:02.517339535Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\""
Feb 13 16:19:02.517619 containerd[1474]: time="2025-02-13T16:19:02.517570108Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully"
Feb 13 16:19:02.517619 containerd[1474]: time="2025-02-13T16:19:02.517588806Z" level=info msg="StopPodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully"
Feb 13 16:19:02.518865 containerd[1474]: time="2025-02-13T16:19:02.518829175Z" level=info msg="RemovePodSandbox for \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\""
Feb 13 16:19:02.518963 containerd[1474]: time="2025-02-13T16:19:02.518875965Z" level=info msg="Forcibly stopping sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\""
Feb 13 16:19:02.519085 containerd[1474]: time="2025-02-13T16:19:02.519060883Z" level=info msg="TearDown network for sandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" successfully"
Feb 13 16:19:02.532054 containerd[1474]: time="2025-02-13T16:19:02.531871943Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.532054 containerd[1474]: time="2025-02-13T16:19:02.532004384Z" level=info msg="RemovePodSandbox \"e3a8d57997d2ff337d04bdd188722339f10f8a60651f9223bf893651ba539e16\" returns successfully"
Feb 13 16:19:02.533401 containerd[1474]: time="2025-02-13T16:19:02.533004686Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\""
Feb 13 16:19:02.533855 containerd[1474]: time="2025-02-13T16:19:02.533646888Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully"
Feb 13 16:19:02.533855 containerd[1474]: time="2025-02-13T16:19:02.533672865Z" level=info msg="StopPodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully"
Feb 13 16:19:02.537103 containerd[1474]: time="2025-02-13T16:19:02.534455083Z" level=info msg="RemovePodSandbox for \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\""
Feb 13 16:19:02.537103 containerd[1474]: time="2025-02-13T16:19:02.534493822Z" level=info msg="Forcibly stopping sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\""
Feb 13 16:19:02.537103 containerd[1474]: time="2025-02-13T16:19:02.534602650Z" level=info msg="TearDown network for sandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" successfully"
Feb 13 16:19:02.540744 containerd[1474]: time="2025-02-13T16:19:02.540627353Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.541088 containerd[1474]: time="2025-02-13T16:19:02.541059713Z" level=info msg="RemovePodSandbox \"bcda1ed5caf2e135eed04b6f2bc55932a1eedbc3e0f58b97dee214bbd3b130b6\" returns successfully"
Feb 13 16:19:02.542425 containerd[1474]: time="2025-02-13T16:19:02.542380692Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\""
Feb 13 16:19:02.542813 containerd[1474]: time="2025-02-13T16:19:02.542786408Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully"
Feb 13 16:19:02.542962 containerd[1474]: time="2025-02-13T16:19:02.542909578Z" level=info msg="StopPodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully"
Feb 13 16:19:02.546588 containerd[1474]: time="2025-02-13T16:19:02.546309362Z" level=info msg="RemovePodSandbox for \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\""
Feb 13 16:19:02.548179 containerd[1474]: time="2025-02-13T16:19:02.548034292Z" level=info msg="Forcibly stopping sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\""
Feb 13 16:19:02.550516 containerd[1474]: time="2025-02-13T16:19:02.550406051Z" level=info msg="TearDown network for sandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" successfully"
Feb 13 16:19:02.557967 containerd[1474]: time="2025-02-13T16:19:02.557727160Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.557967 containerd[1474]: time="2025-02-13T16:19:02.557818175Z" level=info msg="RemovePodSandbox \"a1b514f0c25daf79eb54839c3c950930805e5b8a91c6a2ba3cbea514265fe2b1\" returns successfully"
Feb 13 16:19:02.558823 containerd[1474]: time="2025-02-13T16:19:02.558752120Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\""
Feb 13 16:19:02.558956 containerd[1474]: time="2025-02-13T16:19:02.558931963Z" level=info msg="TearDown network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" successfully"
Feb 13 16:19:02.558982 containerd[1474]: time="2025-02-13T16:19:02.558957593Z" level=info msg="StopPodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" returns successfully"
Feb 13 16:19:02.559718 containerd[1474]: time="2025-02-13T16:19:02.559454727Z" level=info msg="RemovePodSandbox for \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\""
Feb 13 16:19:02.559718 containerd[1474]: time="2025-02-13T16:19:02.559495976Z" level=info msg="Forcibly stopping sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\""
Feb 13 16:19:02.559718 containerd[1474]: time="2025-02-13T16:19:02.559596333Z" level=info msg="TearDown network for sandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" successfully"
Feb 13 16:19:02.565107 containerd[1474]: time="2025-02-13T16:19:02.564624068Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.565107 containerd[1474]: time="2025-02-13T16:19:02.564819575Z" level=info msg="RemovePodSandbox \"625ccdaa9c9c2358350a3bac48c5cd277283572c2a103d6f59d1758be5691f73\" returns successfully"
Feb 13 16:19:02.566151 containerd[1474]: time="2025-02-13T16:19:02.566102214Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\""
Feb 13 16:19:02.566426 containerd[1474]: time="2025-02-13T16:19:02.566394045Z" level=info msg="TearDown network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" successfully"
Feb 13 16:19:02.566426 containerd[1474]: time="2025-02-13T16:19:02.566422481Z" level=info msg="StopPodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" returns successfully"
Feb 13 16:19:02.567074 containerd[1474]: time="2025-02-13T16:19:02.567040131Z" level=info msg="RemovePodSandbox for \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\""
Feb 13 16:19:02.567074 containerd[1474]: time="2025-02-13T16:19:02.567075226Z" level=info msg="Forcibly stopping sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\""
Feb 13 16:19:02.584420 containerd[1474]: time="2025-02-13T16:19:02.584284001Z" level=info msg="TearDown network for sandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" successfully"
Feb 13 16:19:02.589791 containerd[1474]: time="2025-02-13T16:19:02.589701458Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.589985 containerd[1474]: time="2025-02-13T16:19:02.589822789Z" level=info msg="RemovePodSandbox \"4308c19c2f88b5249d48f477b7d5cff2f2f569b2f9dc2b2bcca9b50813fffc84\" returns successfully"
Feb 13 16:19:02.591111 containerd[1474]: time="2025-02-13T16:19:02.590826790Z" level=info msg="StopPodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\""
Feb 13 16:19:02.591111 containerd[1474]: time="2025-02-13T16:19:02.590994227Z" level=info msg="TearDown network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" successfully"
Feb 13 16:19:02.591111 containerd[1474]: time="2025-02-13T16:19:02.591016229Z" level=info msg="StopPodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" returns successfully"
Feb 13 16:19:02.592149 containerd[1474]: time="2025-02-13T16:19:02.591975495Z" level=info msg="RemovePodSandbox for \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\""
Feb 13 16:19:02.592149 containerd[1474]: time="2025-02-13T16:19:02.592057333Z" level=info msg="Forcibly stopping sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\""
Feb 13 16:19:02.596042 containerd[1474]: time="2025-02-13T16:19:02.592624521Z" level=info msg="TearDown network for sandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" successfully"
Feb 13 16:19:02.602693 containerd[1474]: time="2025-02-13T16:19:02.601741847Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.602693 containerd[1474]: time="2025-02-13T16:19:02.601862421Z" level=info msg="RemovePodSandbox \"48952e6bd28c814600b544eaf0cee6141d833d79a5ac987603a38bccfddf5f2c\" returns successfully"
Feb 13 16:19:02.604223 containerd[1474]: time="2025-02-13T16:19:02.603759993Z" level=info msg="StopPodSandbox for \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\""
Feb 13 16:19:02.604223 containerd[1474]: time="2025-02-13T16:19:02.603947609Z" level=info msg="TearDown network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\" successfully"
Feb 13 16:19:02.604223 containerd[1474]: time="2025-02-13T16:19:02.603969295Z" level=info msg="StopPodSandbox for \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\" returns successfully"
Feb 13 16:19:02.612156 containerd[1474]: time="2025-02-13T16:19:02.612009461Z" level=info msg="RemovePodSandbox for \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\""
Feb 13 16:19:02.612156 containerd[1474]: time="2025-02-13T16:19:02.612088045Z" level=info msg="Forcibly stopping sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\""
Feb 13 16:19:02.613353 containerd[1474]: time="2025-02-13T16:19:02.612937164Z" level=info msg="TearDown network for sandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\" successfully"
Feb 13 16:19:02.637889 containerd[1474]: time="2025-02-13T16:19:02.637402684Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:19:02.638436 containerd[1474]: time="2025-02-13T16:19:02.638299025Z" level=info msg="RemovePodSandbox \"5a59c2e749617d96518ca3bd95fa6e060cdafca63544c0ff6ddac088b742654a\" returns successfully"
Feb 13 16:19:03.190120 kubelet[1788]: E0213 16:19:03.190048 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:04.190565 kubelet[1788]: E0213 16:19:04.190492 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:04.302003 systemd[1]: run-containerd-runc-k8s.io-4d53c0c36f456a0688db1d1e608fd6f89a5856e7d7bd716c741001b91f41030c-runc.hM49NT.mount: Deactivated successfully.
Feb 13 16:19:05.191097 kubelet[1788]: E0213 16:19:05.190907 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:06.191775 kubelet[1788]: E0213 16:19:06.191588 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:07.192268 kubelet[1788]: E0213 16:19:07.192134 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:08.193149 kubelet[1788]: E0213 16:19:08.193079 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:09.207183 kubelet[1788]: E0213 16:19:09.194621 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:10.207629 kubelet[1788]: E0213 16:19:10.207535 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:10.373358 systemd[1]: Started sshd@23-64.227.101.255:22-114.10.47.180:44330.service - OpenSSH per-connection server daemon (114.10.47.180:44330).
Feb 13 16:19:11.208039 kubelet[1788]: E0213 16:19:11.207925 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:11.315996 systemd[1]: Started sshd@24-64.227.101.255:22-103.10.44.45:44084.service - OpenSSH per-connection server daemon (103.10.44.45:44084).
Feb 13 16:19:11.528602 sshd[4823]: Invalid user support from 114.10.47.180 port 44330
Feb 13 16:19:11.731066 sshd[4823]: Received disconnect from 114.10.47.180 port 44330:11: Bye Bye [preauth]
Feb 13 16:19:11.731066 sshd[4823]: Disconnected from invalid user support 114.10.47.180 port 44330 [preauth]
Feb 13 16:19:11.732498 systemd[1]: sshd@23-64.227.101.255:22-114.10.47.180:44330.service: Deactivated successfully.
Feb 13 16:19:12.208746 kubelet[1788]: E0213 16:19:12.208660 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:12.322782 sshd[4847]: Invalid user jhernandez from 103.10.44.45 port 44084
Feb 13 16:19:12.509143 sshd[4847]: Received disconnect from 103.10.44.45 port 44084:11: Bye Bye [preauth]
Feb 13 16:19:12.509143 sshd[4847]: Disconnected from invalid user jhernandez 103.10.44.45 port 44084 [preauth]
Feb 13 16:19:12.511081 systemd[1]: sshd@24-64.227.101.255:22-103.10.44.45:44084.service: Deactivated successfully.
Feb 13 16:19:13.210097 kubelet[1788]: E0213 16:19:13.209986 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:13.351813 kubelet[1788]: E0213 16:19:13.351749 1788 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.3 67.207.67.2 67.207.67.3"
Feb 13 16:19:14.211117 kubelet[1788]: E0213 16:19:14.211014 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:15.212004 kubelet[1788]: E0213 16:19:15.211843 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:16.213105 kubelet[1788]: E0213 16:19:16.213021 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:17.214214 kubelet[1788]: E0213 16:19:17.214099 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:18.215400 kubelet[1788]: E0213 16:19:18.215305 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:19.216649 kubelet[1788]: E0213 16:19:19.216549 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:20.065520 systemd[1]: Started sshd@25-64.227.101.255:22-155.4.245.222:64342.service - OpenSSH per-connection server daemon (155.4.245.222:64342).
Feb 13 16:19:20.218063 kubelet[1788]: E0213 16:19:20.217983 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:21.218411 kubelet[1788]: E0213 16:19:21.218338 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:21.860423 sshd[4862]: Invalid user wfp from 155.4.245.222 port 64342
Feb 13 16:19:22.047027 sshd[4862]: Received disconnect from 155.4.245.222 port 64342:11: Bye Bye [preauth]
Feb 13 16:19:22.047027 sshd[4862]: Disconnected from invalid user wfp 155.4.245.222 port 64342 [preauth]
Feb 13 16:19:22.049820 systemd[1]: sshd@25-64.227.101.255:22-155.4.245.222:64342.service: Deactivated successfully.
Feb 13 16:19:22.106990 kubelet[1788]: E0213 16:19:22.106923 1788 file.go:104] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:22.219031 kubelet[1788]: E0213 16:19:22.218946 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:22.502037 kubelet[1788]: I0213 16:19:22.501547 1788 topology_manager.go:215] "Topology Admit Handler" podUID="435bbe54-31e7-4cf1-9886-56f47f31842b" podNamespace="default" podName="test-pod-1"
Feb 13 16:19:22.514038 systemd[1]: Created slice kubepods-besteffort-pod435bbe54_31e7_4cf1_9886_56f47f31842b.slice - libcontainer container kubepods-besteffort-pod435bbe54_31e7_4cf1_9886_56f47f31842b.slice.
Feb 13 16:19:22.676525 kubelet[1788]: I0213 16:19:22.676334 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0d172f2e-f95b-4938-95ce-ba9bc52fe63b\" (UniqueName: \"kubernetes.io/nfs/435bbe54-31e7-4cf1-9886-56f47f31842b-pvc-0d172f2e-f95b-4938-95ce-ba9bc52fe63b\") pod \"test-pod-1\" (UID: \"435bbe54-31e7-4cf1-9886-56f47f31842b\") " pod="default/test-pod-1"
Feb 13 16:19:22.676525 kubelet[1788]: I0213 16:19:22.676436 1788 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmfd\" (UniqueName: \"kubernetes.io/projected/435bbe54-31e7-4cf1-9886-56f47f31842b-kube-api-access-7rmfd\") pod \"test-pod-1\" (UID: \"435bbe54-31e7-4cf1-9886-56f47f31842b\") " pod="default/test-pod-1"
Feb 13 16:19:22.833011 kernel: FS-Cache: Loaded
Feb 13 16:19:22.936476 kernel: RPC: Registered named UNIX socket transport module.
Feb 13 16:19:22.936857 kernel: RPC: Registered udp transport module.
Feb 13 16:19:22.936901 kernel: RPC: Registered tcp transport module.
Feb 13 16:19:22.937727 kernel: RPC: Registered tcp-with-tls transport module.
Feb 13 16:19:22.941600 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 13 16:19:23.222484 kubelet[1788]: E0213 16:19:23.220131 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:23.324417 kernel: NFS: Registering the id_resolver key type
Feb 13 16:19:23.324620 kernel: Key type id_resolver registered
Feb 13 16:19:23.326813 kernel: Key type id_legacy registered
Feb 13 16:19:23.376126 nfsidmap[4881]: nss_getpwnam: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain '2.1-0-f194220f8f'
Feb 13 16:19:23.383256 nfsidmap[4882]: nss_name_to_gid: name 'root@nfs-server-provisioner.default.svc.cluster.local' does not map into domain '2.1-0-f194220f8f'
Feb 13 16:19:23.423515 containerd[1474]: time="2025-02-13T16:19:23.422731317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:435bbe54-31e7-4cf1-9886-56f47f31842b,Namespace:default,Attempt:0,}"
Feb 13 16:19:23.801180 systemd-networkd[1377]: cali5ec59c6bf6e: Link UP
Feb 13 16:19:23.801936 systemd-networkd[1377]: cali5ec59c6bf6e: Gained carrier
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.520 [INFO][4884] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {64.227.101.255-k8s-test--pod--1-eth0 default 435bbe54-31e7-4cf1-9886-56f47f31842b 1604 0 2025-02-13 16:18:49 +0000 UTC map[projectcalico.org/namespace:default projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s 64.227.101.255 test-pod-1 eth0 default [] [] [kns.default ksa.default.default] cali5ec59c6bf6e [] []}} ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.521 [INFO][4884] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.667 [INFO][4894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" HandleID="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Workload="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.692 [INFO][4894] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" HandleID="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Workload="64.227.101.255-k8s-test--pod--1-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b230), Attrs:map[string]string{"namespace":"default", "node":"64.227.101.255", "pod":"test-pod-1", "timestamp":"2025-02-13 16:19:23.667952126 +0000 UTC"}, Hostname:"64.227.101.255", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.692 [INFO][4894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.692 [INFO][4894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.692 [INFO][4894] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host '64.227.101.255'
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.696 [INFO][4894] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.709 [INFO][4894] ipam/ipam.go 372: Looking up existing affinities for host host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.728 [INFO][4894] ipam/ipam.go 489: Trying affinity for 192.168.119.128/26 host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.734 [INFO][4894] ipam/ipam.go 155: Attempting to load block cidr=192.168.119.128/26 host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.741 [INFO][4894] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.119.128/26 host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.741 [INFO][4894] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.119.128/26 handle="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.752 [INFO][4894] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.761 [INFO][4894] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.119.128/26 handle="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.791 [INFO][4894] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.119.133/26] block=192.168.119.128/26 handle="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.792 [INFO][4894] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.119.133/26] handle="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" host="64.227.101.255"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.792 [INFO][4894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.792 [INFO][4894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.119.133/26] IPv6=[] ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" HandleID="k8s-pod-network.ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Workload="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.830025 containerd[1474]: 2025-02-13 16:19:23.795 [INFO][4884] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"435bbe54-31e7-4cf1-9886-56f47f31842b", ResourceVersion:"1604", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 49, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.119.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 16:19:23.833644 containerd[1474]: 2025-02-13 16:19:23.795 [INFO][4884] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.119.133/32] ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.833644 containerd[1474]: 2025-02-13 16:19:23.795 [INFO][4884] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ec59c6bf6e ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.833644 containerd[1474]: 2025-02-13 16:19:23.802 [INFO][4884] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.833644 containerd[1474]: 2025-02-13 16:19:23.805 [INFO][4884] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"64.227.101.255-k8s-test--pod--1-eth0", GenerateName:"", Namespace:"default", SelfLink:"", UID:"435bbe54-31e7-4cf1-9886-56f47f31842b", ResourceVersion:"1604", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 18, 49, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"projectcalico.org/namespace":"default", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"64.227.101.255", ContainerID:"ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6", Pod:"test-pod-1", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.119.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.default", "ksa.default.default"}, InterfaceName:"cali5ec59c6bf6e", MAC:"52:96:0c:6c:8f:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 16:19:23.833644 containerd[1474]: 2025-02-13 16:19:23.818 [INFO][4884] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6" Namespace="default" Pod="test-pod-1" WorkloadEndpoint="64.227.101.255-k8s-test--pod--1-eth0"
Feb 13 16:19:23.885098 containerd[1474]: time="2025-02-13T16:19:23.884627650Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:19:23.885098 containerd[1474]: time="2025-02-13T16:19:23.884710373Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:19:23.885098 containerd[1474]: time="2025-02-13T16:19:23.884738062Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:19:23.885098 containerd[1474]: time="2025-02-13T16:19:23.884872261Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:19:23.925165 systemd[1]: run-containerd-runc-k8s.io-ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6-runc.7nfO0c.mount: Deactivated successfully.
Feb 13 16:19:23.936608 systemd[1]: Started cri-containerd-ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6.scope - libcontainer container ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6.
Feb 13 16:19:24.007294 containerd[1474]: time="2025-02-13T16:19:24.007022665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:test-pod-1,Uid:435bbe54-31e7-4cf1-9886-56f47f31842b,Namespace:default,Attempt:0,} returns sandbox id \"ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6\""
Feb 13 16:19:24.043706 containerd[1474]: time="2025-02-13T16:19:24.043316984Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\""
Feb 13 16:19:24.221544 kubelet[1788]: E0213 16:19:24.221462 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:24.498916 containerd[1474]: time="2025-02-13T16:19:24.496092989Z" level=info msg="stop pulling image ghcr.io/flatcar/nginx:latest: active requests=0, bytes read=61"
Feb 13 16:19:24.501947 containerd[1474]: time="2025-02-13T16:19:24.501109414Z" level=info msg="Pulled image \"ghcr.io/flatcar/nginx:latest\" with image id \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\", repo tag \"ghcr.io/flatcar/nginx:latest\", repo digest \"ghcr.io/flatcar/nginx@sha256:d9bc3da999da9f147f1277c7b18292486847e8f39f95fcf81d914d0c22815faf\", size \"73054371\" in 457.729599ms"
Feb 13 16:19:24.501947 containerd[1474]: time="2025-02-13T16:19:24.501165997Z" level=info msg="PullImage \"ghcr.io/flatcar/nginx:latest\" returns image reference \"sha256:fe94eb5f0c9c8d0ca277aa8cd5940f1faf5970175bf373932babc578545deda8\""
Feb 13 16:19:24.504050 containerd[1474]: time="2025-02-13T16:19:24.502982361Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/nginx:latest\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:19:24.508603 containerd[1474]: time="2025-02-13T16:19:24.507873613Z" level=info msg="CreateContainer within sandbox \"ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6\" for container &ContainerMetadata{Name:test,Attempt:0,}"
Feb 13 16:19:24.539601 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2791257540.mount: Deactivated successfully.
Feb 13 16:19:24.540418 containerd[1474]: time="2025-02-13T16:19:24.540357788Z" level=info msg="CreateContainer within sandbox \"ee789eff2a55a5a639e92d89f77e9ab130786a6f6f88d036607f9c8b43dab2e6\" for &ContainerMetadata{Name:test,Attempt:0,} returns container id \"88e1da9de6e2da8c304a6b14590c8f63be2b6782690581994a764347882be200\""
Feb 13 16:19:24.543076 containerd[1474]: time="2025-02-13T16:19:24.543024112Z" level=info msg="StartContainer for \"88e1da9de6e2da8c304a6b14590c8f63be2b6782690581994a764347882be200\""
Feb 13 16:19:24.597633 systemd[1]: Started cri-containerd-88e1da9de6e2da8c304a6b14590c8f63be2b6782690581994a764347882be200.scope - libcontainer container 88e1da9de6e2da8c304a6b14590c8f63be2b6782690581994a764347882be200.
Feb 13 16:19:24.650896 containerd[1474]: time="2025-02-13T16:19:24.650830961Z" level=info msg="StartContainer for \"88e1da9de6e2da8c304a6b14590c8f63be2b6782690581994a764347882be200\" returns successfully"
Feb 13 16:19:25.006841 systemd-networkd[1377]: cali5ec59c6bf6e: Gained IPv6LL
Feb 13 16:19:25.222260 kubelet[1788]: E0213 16:19:25.222144 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:25.320940 kubelet[1788]: I0213 16:19:25.320434 1788 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="default/test-pod-1" podStartSLOduration=35.860083 podStartE2EDuration="36.320350847s" podCreationTimestamp="2025-02-13 16:18:49 +0000 UTC" firstStartedPulling="2025-02-13 16:19:24.042362926 +0000 UTC m=+82.737340813" lastFinishedPulling="2025-02-13 16:19:24.502630774 +0000 UTC m=+83.197608660" observedRunningTime="2025-02-13 16:19:25.319779034 +0000 UTC m=+84.014756912" watchObservedRunningTime="2025-02-13 16:19:25.320350847 +0000 UTC m=+84.015328747"
Feb 13 16:19:26.223197 kubelet[1788]: E0213 16:19:26.223023 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:27.223818 kubelet[1788]: E0213 16:19:27.223725 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:28.225029 kubelet[1788]: E0213 16:19:28.224905 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:29.225897 kubelet[1788]: E0213 16:19:29.225828 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:30.234182 kubelet[1788]: E0213 16:19:30.233627 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"
Feb 13 16:19:31.234610 kubelet[1788]: E0213 16:19:31.234535 1788 file_linux.go:61] "Unable to read config path" err="path does not exist, ignoring" path="/etc/kubernetes/manifests"