Mar 17 17:58:00.354195 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Mon Mar 17 16:09:25 -00 2025
Mar 17 17:58:00.354226 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=2a4a0f64c0160ed10b339be09fdc9d7e265b13f78aefc87616e79bf13c00bb1c
Mar 17 17:58:00.354241 kernel: BIOS-provided physical RAM map:
Mar 17 17:58:00.354250 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 17 17:58:00.354259 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 17 17:58:00.354267 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 17 17:58:00.354278 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Mar 17 17:58:00.354287 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Mar 17 17:58:00.354296 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 17 17:58:00.354308 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 17 17:58:00.354317 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 17 17:58:00.354326 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 17 17:58:00.354334 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 17 17:58:00.354344 kernel: NX (Execute Disable) protection: active
Mar 17 17:58:00.354355 kernel: APIC: Static calls initialized
Mar 17 17:58:00.354367 kernel: SMBIOS 2.8 present.
Mar 17 17:58:00.354377 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Mar 17 17:58:00.354386 kernel: Hypervisor detected: KVM
Mar 17 17:58:00.354396 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 17 17:58:00.354405 kernel: kvm-clock: using sched offset of 3056447450 cycles
Mar 17 17:58:00.354416 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 17 17:58:00.354425 kernel: tsc: Detected 2794.750 MHz processor
Mar 17 17:58:00.354435 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 17 17:58:00.354445 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 17 17:58:00.354455 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Mar 17 17:58:00.354468 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 17 17:58:00.354478 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 17 17:58:00.354488 kernel: Using GB pages for direct mapping
Mar 17 17:58:00.354498 kernel: ACPI: Early table checksum verification disabled
Mar 17 17:58:00.354508 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Mar 17 17:58:00.354518 kernel: ACPI: RSDT 0x000000009CFE2408 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354528 kernel: ACPI: FACP 0x000000009CFE21E8 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354538 kernel: ACPI: DSDT 0x000000009CFE0040 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354551 kernel: ACPI: FACS 0x000000009CFE0000 000040
Mar 17 17:58:00.354561 kernel: ACPI: APIC 0x000000009CFE22DC 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354584 kernel: ACPI: HPET 0x000000009CFE236C 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354594 kernel: ACPI: MCFG 0x000000009CFE23A4 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354604 kernel: ACPI: WAET 0x000000009CFE23E0 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:58:00.354615 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21e8-0x9cfe22db]
Mar 17 17:58:00.354624 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21e7]
Mar 17 17:58:00.354640 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Mar 17 17:58:00.354653 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22dc-0x9cfe236b]
Mar 17 17:58:00.354664 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe236c-0x9cfe23a3]
Mar 17 17:58:00.354674 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23a4-0x9cfe23df]
Mar 17 17:58:00.354684 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23e0-0x9cfe2407]
Mar 17 17:58:00.354695 kernel: No NUMA configuration found
Mar 17 17:58:00.354705 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Mar 17 17:58:00.354718 kernel: NODE_DATA(0) allocated [mem 0x9cfd6000-0x9cfdbfff]
Mar 17 17:58:00.354727 kernel: Zone ranges:
Mar 17 17:58:00.354737 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 17 17:58:00.354748 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Mar 17 17:58:00.354758 kernel: Normal empty
Mar 17 17:58:00.354769 kernel: Movable zone start for each node
Mar 17 17:58:00.354780 kernel: Early memory node ranges
Mar 17 17:58:00.354790 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 17 17:58:00.354800 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Mar 17 17:58:00.354811 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Mar 17 17:58:00.354826 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 17 17:58:00.354836 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 17 17:58:00.354847 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Mar 17 17:58:00.354858 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 17 17:58:00.354868 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 17 17:58:00.354878 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 17 17:58:00.354889 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 17 17:58:00.354899 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 17 17:58:00.354909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 17 17:58:00.354923 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 17 17:58:00.354933 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 17 17:58:00.354943 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 17 17:58:00.354953 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 17 17:58:00.354964 kernel: TSC deadline timer available
Mar 17 17:58:00.354974 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs
Mar 17 17:58:00.354984 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 17 17:58:00.355242 kernel: kvm-guest: KVM setup pv remote TLB flush
Mar 17 17:58:00.355255 kernel: kvm-guest: setup PV sched yield
Mar 17 17:58:00.355271 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 17 17:58:00.355282 kernel: Booting paravirtualized kernel on KVM
Mar 17 17:58:00.355294 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 17 17:58:00.355304 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Mar 17 17:58:00.355315 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u524288
Mar 17 17:58:00.355325 kernel: pcpu-alloc: s197032 r8192 d32344 u524288 alloc=1*2097152
Mar 17 17:58:00.355335 kernel: pcpu-alloc: [0] 0 1 2 3
Mar 17 17:58:00.355345 kernel: kvm-guest: PV spinlocks enabled
Mar 17 17:58:00.355356 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 17 17:58:00.355371 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=2a4a0f64c0160ed10b339be09fdc9d7e265b13f78aefc87616e79bf13c00bb1c
Mar 17 17:58:00.355383 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 17 17:58:00.355393 kernel: random: crng init done
Mar 17 17:58:00.355404 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 17 17:58:00.355414 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 17 17:58:00.355424 kernel: Fallback order for Node 0: 0
Mar 17 17:58:00.355435 kernel: Built 1 zonelists, mobility grouping on. Total pages: 632732
Mar 17 17:58:00.355445 kernel: Policy zone: DMA32
Mar 17 17:58:00.355456 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 17 17:58:00.355470 kernel: Memory: 2432544K/2571752K available (14336K kernel code, 2303K rwdata, 22860K rodata, 43476K init, 1596K bss, 138948K reserved, 0K cma-reserved)
Mar 17 17:58:00.355481 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 17 17:58:00.355492 kernel: ftrace: allocating 37910 entries in 149 pages
Mar 17 17:58:00.355503 kernel: ftrace: allocated 149 pages with 4 groups
Mar 17 17:58:00.355551 kernel: Dynamic Preempt: voluntary
Mar 17 17:58:00.355567 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 17 17:58:00.355599 kernel: rcu: RCU event tracing is enabled.
Mar 17 17:58:00.355611 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 17 17:58:00.355625 kernel: Trampoline variant of Tasks RCU enabled.
Mar 17 17:58:00.355636 kernel: Rude variant of Tasks RCU enabled.
Mar 17 17:58:00.355646 kernel: Tracing variant of Tasks RCU enabled.
Mar 17 17:58:00.355658 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 17 17:58:00.355668 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 17 17:58:00.355679 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Mar 17 17:58:00.355689 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 17 17:58:00.355699 kernel: Console: colour VGA+ 80x25
Mar 17 17:58:00.355709 kernel: printk: console [ttyS0] enabled
Mar 17 17:58:00.355719 kernel: ACPI: Core revision 20230628
Mar 17 17:58:00.355733 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 17 17:58:00.355743 kernel: APIC: Switch to symmetric I/O mode setup
Mar 17 17:58:00.355754 kernel: x2apic enabled
Mar 17 17:58:00.355764 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 17 17:58:00.355774 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Mar 17 17:58:00.355784 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Mar 17 17:58:00.355795 kernel: kvm-guest: setup PV IPIs
Mar 17 17:58:00.355816 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 17 17:58:00.355828 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 17 17:58:00.355838 kernel: Calibrating delay loop (skipped) preset value.. 5589.50 BogoMIPS (lpj=2794750)
Mar 17 17:58:00.355849 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 17 17:58:00.355863 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 17 17:58:00.355874 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 17 17:58:00.355885 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 17 17:58:00.355896 kernel: Spectre V2 : Mitigation: Retpolines
Mar 17 17:58:00.355908 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 17 17:58:00.355922 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 17 17:58:00.355933 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Mar 17 17:58:00.355944 kernel: RETBleed: Mitigation: untrained return thunk
Mar 17 17:58:00.355954 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 17 17:58:00.355963 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 17 17:58:00.355974 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Mar 17 17:58:00.355986 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Mar 17 17:58:00.356005 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Mar 17 17:58:00.356020 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 17 17:58:00.356031 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 17 17:58:00.356042 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 17 17:58:00.356053 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 17 17:58:00.356065 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 17 17:58:00.356076 kernel: Freeing SMP alternatives memory: 32K
Mar 17 17:58:00.356087 kernel: pid_max: default: 32768 minimum: 301
Mar 17 17:58:00.356097 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 17 17:58:00.356108 kernel: landlock: Up and running.
Mar 17 17:58:00.356123 kernel: SELinux: Initializing.
Mar 17 17:58:00.356134 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 17 17:58:00.356145 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 17 17:58:00.356156 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Mar 17 17:58:00.356167 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 17 17:58:00.356178 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 17 17:58:00.356189 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 17 17:58:00.356200 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 17 17:58:00.356210 kernel: ... version:                0
Mar 17 17:58:00.356222 kernel: ... bit width:              48
Mar 17 17:58:00.356232 kernel: ... generic registers:      6
Mar 17 17:58:00.356244 kernel: ... value mask:             0000ffffffffffff
Mar 17 17:58:00.356255 kernel: ... max period:             00007fffffffffff
Mar 17 17:58:00.356266 kernel: ... fixed-purpose events:   0
Mar 17 17:58:00.356277 kernel: ... event mask:             000000000000003f
Mar 17 17:58:00.356288 kernel: signal: max sigframe size: 1776
Mar 17 17:58:00.356299 kernel: rcu: Hierarchical SRCU implementation.
Mar 17 17:58:00.356310 kernel: rcu: Max phase no-delay instances is 400.
Mar 17 17:58:00.356324 kernel: smp: Bringing up secondary CPUs ...
Mar 17 17:58:00.356334 kernel: smpboot: x86: Booting SMP configuration:
Mar 17 17:58:00.356345 kernel: .... node #0, CPUs: #1 #2 #3
Mar 17 17:58:00.356356 kernel: smp: Brought up 1 node, 4 CPUs
Mar 17 17:58:00.356368 kernel: smpboot: Max logical packages: 1
Mar 17 17:58:00.356379 kernel: smpboot: Total of 4 processors activated (22358.00 BogoMIPS)
Mar 17 17:58:00.356390 kernel: devtmpfs: initialized
Mar 17 17:58:00.356401 kernel: x86/mm: Memory block size: 128MB
Mar 17 17:58:00.356412 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 17 17:58:00.356423 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 17 17:58:00.356437 kernel: pinctrl core: initialized pinctrl subsystem
Mar 17 17:58:00.356448 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 17 17:58:00.356460 kernel: audit: initializing netlink subsys (disabled)
Mar 17 17:58:00.356471 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 17 17:58:00.356482 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 17 17:58:00.356493 kernel: cpuidle: using governor menu
Mar 17 17:58:00.356504 kernel: audit: type=2000 audit(1742234278.682:1): state=initialized audit_enabled=0 res=1
Mar 17 17:58:00.356551 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 17 17:58:00.356570 kernel: dca service started, version 1.12.1
Mar 17 17:58:00.356596 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 17 17:58:00.356607 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 17 17:58:00.356619 kernel: PCI: Using configuration type 1 for base access
Mar 17 17:58:00.356630 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 17 17:58:00.356642 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 17 17:58:00.356653 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 17 17:58:00.356665 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 17 17:58:00.356676 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 17 17:58:00.356690 kernel: ACPI: Added _OSI(Module Device)
Mar 17 17:58:00.356701 kernel: ACPI: Added _OSI(Processor Device)
Mar 17 17:58:00.356712 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 17 17:58:00.356724 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 17 17:58:00.356735 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 17 17:58:00.356746 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 17 17:58:00.356757 kernel: ACPI: Interpreter enabled
Mar 17 17:58:00.356768 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 17 17:58:00.356779 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 17 17:58:00.356790 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 17 17:58:00.356805 kernel: PCI: Using E820 reservations for host bridge windows
Mar 17 17:58:00.356816 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 17 17:58:00.356827 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 17 17:58:00.357132 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 17 17:58:00.357305 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 17 17:58:00.357467 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 17 17:58:00.357482 kernel: PCI host bridge to bus 0000:00
Mar 17 17:58:00.357712 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 17 17:58:00.357867 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 17 17:58:00.358050 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 17 17:58:00.358197 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Mar 17 17:58:00.358346 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 17 17:58:00.360756 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Mar 17 17:58:00.360922 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 17 17:58:00.361147 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 17 17:58:00.361322 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000
Mar 17 17:58:00.361487 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfd000000-0xfdffffff pref]
Mar 17 17:58:00.361715 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfebd0000-0xfebd0fff]
Mar 17 17:58:00.361880 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfebc0000-0xfebcffff pref]
Mar 17 17:58:00.362089 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 17 17:58:00.362265 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00
Mar 17 17:58:00.362426 kernel: pci 0000:00:02.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 17 17:58:00.362652 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xfebd1000-0xfebd1fff]
Mar 17 17:58:00.362816 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfe000000-0xfe003fff 64bit pref]
Mar 17 17:58:00.363006 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000
Mar 17 17:58:00.363169 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc000-0xc07f]
Mar 17 17:58:00.363331 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebd2000-0xfebd2fff]
Mar 17 17:58:00.363496 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe004000-0xfe007fff 64bit pref]
Mar 17 17:58:00.363748 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
Mar 17 17:58:00.363927 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc0e0-0xc0ff]
Mar 17 17:58:00.364155 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebd3000-0xfebd3fff]
Mar 17 17:58:00.364325 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe008000-0xfe00bfff 64bit pref]
Mar 17 17:58:00.364496 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfeb80000-0xfebbffff pref]
Mar 17 17:58:00.368325 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 17 17:58:00.368566 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 17 17:58:00.368815 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 17 17:58:00.368987 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc100-0xc11f]
Mar 17 17:58:00.369155 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfebd4000-0xfebd4fff]
Mar 17 17:58:00.369332 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 17 17:58:00.369547 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 17 17:58:00.369573 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 17 17:58:00.369613 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 17 17:58:00.369624 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 17 17:58:00.369636 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 17 17:58:00.369647 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 17 17:58:00.369658 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 17 17:58:00.369669 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 17 17:58:00.369681 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 17 17:58:00.369693 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 17 17:58:00.369708 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 17 17:58:00.369719 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 17 17:58:00.369730 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 17 17:58:00.369742 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 17 17:58:00.369754 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 17 17:58:00.369766 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 17 17:58:00.369778 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 17 17:58:00.369789 kernel: iommu: Default domain type: Translated
Mar 17 17:58:00.369801 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 17 17:58:00.369818 kernel: PCI: Using ACPI for IRQ routing
Mar 17 17:58:00.369829 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 17 17:58:00.369840 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 17 17:58:00.369852 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Mar 17 17:58:00.370050 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 17 17:58:00.370230 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 17 17:58:00.370406 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 17 17:58:00.370437 kernel: vgaarb: loaded
Mar 17 17:58:00.370455 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 17 17:58:00.370467 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 17 17:58:00.370479 kernel: clocksource: Switched to clocksource kvm-clock
Mar 17 17:58:00.370491 kernel: VFS: Disk quotas dquot_6.6.0
Mar 17 17:58:00.370502 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 17 17:58:00.370561 kernel: pnp: PnP ACPI init
Mar 17 17:58:00.370779 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 17 17:58:00.370799 kernel: pnp: PnP ACPI: found 6 devices
Mar 17 17:58:00.370812 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 17 17:58:00.370829 kernel: NET: Registered PF_INET protocol family
Mar 17 17:58:00.370842 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 17 17:58:00.370854 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 17 17:58:00.370865 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 17 17:58:00.370876 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 17 17:58:00.370887 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 17 17:58:00.370898 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 17 17:58:00.370909 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:58:00.370925 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:58:00.370936 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 17 17:58:00.370947 kernel: NET: Registered PF_XDP protocol family
Mar 17 17:58:00.371158 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 17 17:58:00.371319 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 17 17:58:00.371476 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 17 17:58:00.371695 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Mar 17 17:58:00.371852 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 17 17:58:00.372039 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Mar 17 17:58:00.372066 kernel: PCI: CLS 0 bytes, default 64
Mar 17 17:58:00.372079 kernel: Initialise system trusted keyrings
Mar 17 17:58:00.372090 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 17 17:58:00.372101 kernel: Key type asymmetric registered
Mar 17 17:58:00.372111 kernel: Asymmetric key parser 'x509' registered
Mar 17 17:58:00.372122 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 17 17:58:00.372134 kernel: io scheduler mq-deadline registered
Mar 17 17:58:00.372145 kernel: io scheduler kyber registered
Mar 17 17:58:00.372156 kernel: io scheduler bfq registered
Mar 17 17:58:00.372170 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 17 17:58:00.372182 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 17 17:58:00.372193 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Mar 17 17:58:00.372203 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Mar 17 17:58:00.372214 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 17 17:58:00.372226 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 17 17:58:00.372236 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 17 17:58:00.372247 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 17 17:58:00.372258 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 17 17:58:00.372431 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 17 17:58:00.372610 kernel: rtc_cmos 00:04: registered as rtc0
Mar 17 17:58:00.372775 kernel: rtc_cmos 00:04: setting system clock to 2025-03-17T17:57:59 UTC (1742234279)
Mar 17 17:58:00.372951 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Mar 17 17:58:00.372974 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 17 17:58:00.372989 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 17 17:58:00.373019 kernel: NET: Registered PF_INET6 protocol family
Mar 17 17:58:00.373033 kernel: Segment Routing with IPv6
Mar 17 17:58:00.373054 kernel: In-situ OAM (IOAM) with IPv6
Mar 17 17:58:00.373069 kernel: NET: Registered PF_PACKET protocol family
Mar 17 17:58:00.373085 kernel: Key type dns_resolver registered
Mar 17 17:58:00.373099 kernel: IPI shorthand broadcast: enabled
Mar 17 17:58:00.373113 kernel: sched_clock: Marking stable (961003936, 134123933)->(1277335163, -182207294)
Mar 17 17:58:00.373127 kernel: registered taskstats version 1
Mar 17 17:58:00.373142 kernel: Loading compiled-in X.509 certificates
Mar 17 17:58:00.373155 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 2d438fc13e28f87f3f580874887bade2e2b0c7dd'
Mar 17 17:58:00.373166 kernel: Key type .fscrypt registered
Mar 17 17:58:00.373181 kernel: Key type fscrypt-provisioning registered
Mar 17 17:58:00.373192 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 17 17:58:00.373203 kernel: ima: Allocated hash algorithm: sha1
Mar 17 17:58:00.373213 kernel: ima: No architecture policies found
Mar 17 17:58:00.373225 kernel: clk: Disabling unused clocks
Mar 17 17:58:00.373237 kernel: Freeing unused kernel image (initmem) memory: 43476K
Mar 17 17:58:00.373248 kernel: Write protecting the kernel read-only data: 38912k
Mar 17 17:58:00.373259 kernel: Freeing unused kernel image (rodata/data gap) memory: 1716K
Mar 17 17:58:00.373271 kernel: Run /init as init process
Mar 17 17:58:00.373286 kernel: with arguments:
Mar 17 17:58:00.373298 kernel: /init
Mar 17 17:58:00.373309 kernel: with environment:
Mar 17 17:58:00.373321 kernel: HOME=/
Mar 17 17:58:00.373333 kernel: TERM=linux
Mar 17 17:58:00.373344 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 17 17:58:00.373358 systemd[1]: Successfully made /usr/ read-only.
Mar 17 17:58:00.373375 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 17 17:58:00.373392 systemd[1]: Detected virtualization kvm.
Mar 17 17:58:00.373405 systemd[1]: Detected architecture x86-64.
Mar 17 17:58:00.373418 systemd[1]: Running in initrd.
Mar 17 17:58:00.373431 systemd[1]: No hostname configured, using default hostname.
Mar 17 17:58:00.373444 systemd[1]: Hostname set to .
Mar 17 17:58:00.373457 systemd[1]: Initializing machine ID from VM UUID.
Mar 17 17:58:00.373470 systemd[1]: Queued start job for default target initrd.target.
Mar 17 17:58:00.373483 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:58:00.373499 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:58:00.373529 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 17 17:58:00.373544 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:58:00.373555 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 17 17:58:00.373568 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 17 17:58:00.373602 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 17 17:58:00.373614 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 17 17:58:00.373626 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:58:00.373639 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:58:00.373652 systemd[1]: Reached target paths.target - Path Units.
Mar 17 17:58:00.373665 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:58:00.373678 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:58:00.373691 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 17:58:00.373707 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:58:00.373718 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:58:00.373730 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 17 17:58:00.373742 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 17 17:58:00.373754 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:58:00.373765 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:58:00.373777 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:58:00.373790 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:58:00.373806 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:58:00.373818 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:58:00.373831 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:58:00.373844 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:58:00.373856 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:58:00.373869 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:58:00.373883 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:58:00.373896 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:58:00.373944 systemd-journald[194]: Collecting audit messages is disabled. Mar 17 17:58:00.374001 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:58:00.374027 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:58:00.374044 systemd-journald[194]: Journal started Mar 17 17:58:00.374084 systemd-journald[194]: Runtime Journal (/run/log/journal/416dcbd3aaba48fcb4bfd9d937bc5b75) is 6M, max 48.4M, 42.3M free. Mar 17 17:58:00.393037 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:58:00.390156 systemd-modules-load[195]: Inserted module 'overlay' Mar 17 17:58:00.433605 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:58:00.436746 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:58:00.450165 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:58:00.491624 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Mar 17 17:58:00.498967 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:58:00.503037 kernel: Bridge firewalling registered Mar 17 17:58:00.505348 systemd-modules-load[195]: Inserted module 'br_netfilter' Mar 17 17:58:00.510717 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:58:00.512827 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:58:00.513215 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:58:00.532596 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:58:00.561974 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:58:00.572907 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:58:00.578928 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:58:00.589234 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:58:00.593121 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:58:00.607850 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:58:00.641357 dracut-cmdline[233]: dracut-dracut-053 Mar 17 17:58:00.645470 dracut-cmdline[233]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=2a4a0f64c0160ed10b339be09fdc9d7e265b13f78aefc87616e79bf13c00bb1c Mar 17 17:58:00.691915 systemd-resolved[230]: Positive Trust Anchors: Mar 17 17:58:00.691940 systemd-resolved[230]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:58:00.696072 systemd-resolved[230]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:58:00.704244 systemd-resolved[230]: Defaulting to hostname 'linux'. Mar 17 17:58:00.710185 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:58:00.731150 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:58:00.842620 kernel: SCSI subsystem initialized Mar 17 17:58:00.868378 kernel: Loading iSCSI transport class v2.0-870. Mar 17 17:58:00.958554 kernel: iscsi: registered transport (tcp) Mar 17 17:58:01.009170 kernel: iscsi: registered transport (qla4xxx) Mar 17 17:58:01.009253 kernel: QLogic iSCSI HBA Driver Mar 17 17:58:01.168623 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 17 17:58:01.184013 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 17 17:58:01.239066 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 17 17:58:01.239162 kernel: device-mapper: uevent: version 1.0.3 Mar 17 17:58:01.239178 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 17 17:58:01.348650 kernel: raid6: avx2x4 gen() 16579 MB/s Mar 17 17:58:01.364649 kernel: raid6: avx2x2 gen() 15360 MB/s Mar 17 17:58:01.382534 kernel: raid6: avx2x1 gen() 13958 MB/s Mar 17 17:58:01.382646 kernel: raid6: using algorithm avx2x4 gen() 16579 MB/s Mar 17 17:58:01.401544 kernel: raid6: .... xor() 4729 MB/s, rmw enabled Mar 17 17:58:01.401638 kernel: raid6: using avx2x2 recovery algorithm Mar 17 17:58:01.449162 kernel: xor: automatically using best checksumming function avx Mar 17 17:58:01.794635 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 17 17:58:01.824946 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:58:01.853956 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:58:01.877845 systemd-udevd[415]: Using default interface naming scheme 'v255'. Mar 17 17:58:01.888706 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:58:01.909877 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 17 17:58:01.942098 dracut-pre-trigger[417]: rd.md=0: removing MD RAID activation Mar 17 17:58:02.027811 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 17:58:02.048971 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:58:02.180191 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:58:02.221403 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 17 17:58:02.245210 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 17 17:58:02.249129 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Mar 17 17:58:02.252923 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:58:02.256260 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:58:02.262617 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Mar 17 17:58:02.314127 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Mar 17 17:58:02.314355 kernel: cryptd: max_cpu_qlen set to 1000 Mar 17 17:58:02.314372 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 17 17:58:02.314396 kernel: GPT:9289727 != 19775487 Mar 17 17:58:02.314410 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 17 17:58:02.314424 kernel: GPT:9289727 != 19775487 Mar 17 17:58:02.314437 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 17 17:58:02.314451 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 17:58:02.279178 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 17 17:58:02.304035 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:58:02.334779 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:58:02.335028 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:58:02.340858 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:58:02.342371 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:58:02.344259 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:58:02.356495 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:58:02.373240 kernel: libata version 3.00 loaded. Mar 17 17:58:02.382016 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:58:02.384058 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
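The GPT warnings above report that the backup (alternate) GPT header sits at LBA 9289727 while the disk has 19775488 logical blocks, i.e. the disk image was grown after partitioning; disk-uuid.service rewrites the table later in this boot. A minimal sketch of the arithmetic behind the mismatch (the repair command is shown commented; `/dev/vda` is the device from the log):

```shell
# The kernel logged: GPT:9289727 != 19775487 for a 19775488-block disk.
blocks=19775488                 # 512-byte logical blocks reported for vda
expected_alt=$((blocks - 1))    # the backup GPT header belongs on the last LBA
echo "expected alternate header LBA: $expected_alt"

# Manual repair (not needed here -- disk-uuid.service regenerates the GPT):
#   sgdisk -e /dev/vda    # relocate backup GPT structures to the end of the disk
```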
Mar 17 17:58:02.425986 kernel: AVX2 version of gcm_enc/dec engaged. Mar 17 17:58:02.430597 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (475) Mar 17 17:58:02.473616 kernel: AES CTR mode by8 optimization enabled Mar 17 17:58:02.476600 kernel: BTRFS: device fsid 16b3954e-2e86-4c7f-a948-d3d3817b1bdc devid 1 transid 42 /dev/vda3 scanned by (udev-worker) (461) Mar 17 17:58:02.496178 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 17 17:58:02.522714 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:58:02.535714 kernel: ahci 0000:00:1f.2: version 3.0 Mar 17 17:58:02.559212 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 17 17:58:02.559241 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 17 17:58:02.559463 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 17 17:58:02.559675 kernel: scsi host0: ahci Mar 17 17:58:02.559857 kernel: scsi host1: ahci Mar 17 17:58:02.560045 kernel: scsi host2: ahci Mar 17 17:58:02.560237 kernel: scsi host3: ahci Mar 17 17:58:02.560421 kernel: scsi host4: ahci Mar 17 17:58:02.560595 kernel: scsi host5: ahci Mar 17 17:58:02.560760 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 Mar 17 17:58:02.560774 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 Mar 17 17:58:02.560793 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 Mar 17 17:58:02.560805 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 Mar 17 17:58:02.560818 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 Mar 17 17:58:02.560830 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 Mar 17 17:58:02.542161 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Mar 17 17:58:02.576313 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 17 17:58:02.580841 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 17 17:58:02.597882 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 17 17:58:02.615889 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 17 17:58:02.625848 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:58:02.647025 disk-uuid[559]: Primary Header is updated. Mar 17 17:58:02.647025 disk-uuid[559]: Secondary Entries is updated. Mar 17 17:58:02.647025 disk-uuid[559]: Secondary Header is updated. Mar 17 17:58:02.671268 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 17:58:02.677998 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 17 17:58:02.875639 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 17 17:58:02.877612 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 17 17:58:02.877702 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 17 17:58:02.877718 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 17 17:58:02.878604 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 17 17:58:02.879622 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Mar 17 17:58:02.881047 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Mar 17 17:58:02.881070 kernel: ata3.00: applying bridge limits Mar 17 17:58:02.882608 kernel: ata3.00: configured for UDMA/100 Mar 17 17:58:02.884607 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 17 17:58:02.940766 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Mar 17 17:58:02.953647 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 17 17:58:02.953671 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Mar 17 17:58:03.692319 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 17:58:03.698491 disk-uuid[560]: The operation has completed successfully. Mar 17 17:58:03.814760 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 17:58:03.816432 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 17 17:58:03.895924 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 17 17:58:03.920469 sh[596]: Success Mar 17 17:58:03.987765 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" Mar 17 17:58:04.068704 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 17 17:58:04.093002 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 17 17:58:04.104527 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 17 17:58:04.159629 kernel: BTRFS info (device dm-0): first mount of filesystem 16b3954e-2e86-4c7f-a948-d3d3817b1bdc Mar 17 17:58:04.159709 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:58:04.159725 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 17 17:58:04.162943 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 17 17:58:04.169340 kernel: BTRFS info (device dm-0): using free space tree Mar 17 17:58:04.178550 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 17 17:58:04.184280 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 17 17:58:04.201775 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 17 17:58:04.225222 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 17 17:58:04.251617 kernel: BTRFS info (device vda6): first mount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 17:58:04.251696 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:58:04.251712 kernel: BTRFS info (device vda6): using free space tree Mar 17 17:58:04.267708 kernel: BTRFS info (device vda6): auto enabling async discard Mar 17 17:58:04.290601 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 17:58:04.296348 kernel: BTRFS info (device vda6): last unmount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 17:58:04.415864 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 17 17:58:04.441178 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 17 17:58:04.543079 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:58:04.569150 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Mar 17 17:58:04.623456 ignition[739]: Ignition 2.20.0 Mar 17 17:58:04.623470 ignition[739]: Stage: fetch-offline Mar 17 17:58:04.623524 ignition[739]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:58:04.623535 ignition[739]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:58:04.623650 ignition[739]: parsed url from cmdline: "" Mar 17 17:58:04.623655 ignition[739]: no config URL provided Mar 17 17:58:04.623661 ignition[739]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 17:58:04.623671 ignition[739]: no config at "/usr/lib/ignition/user.ign" Mar 17 17:58:04.623701 ignition[739]: op(1): [started] loading QEMU firmware config module Mar 17 17:58:04.623707 ignition[739]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 17 17:58:04.640958 ignition[739]: op(1): [finished] loading QEMU firmware config module Mar 17 17:58:04.680305 systemd-networkd[779]: lo: Link UP Mar 17 17:58:04.680320 systemd-networkd[779]: lo: Gained carrier Mar 17 17:58:04.682620 systemd-networkd[779]: Enumeration completed Mar 17 17:58:04.683095 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:58:04.683100 systemd-networkd[779]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:58:04.683974 systemd-networkd[779]: eth0: Link UP Mar 17 17:58:04.683979 systemd-networkd[779]: eth0: Gained carrier Mar 17 17:58:04.749890 ignition[739]: parsing config with SHA512: bc71d026ecb99d2c4b338f3e28455008956a3ef8c6f0117ec146d4a6c9403fbe7c343aaf4f83d51e7866b8f93619f74dd5198eeec5cc5fae68b4f6955ef6bf62 Mar 17 17:58:04.683988 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:58:04.684987 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Mar 17 17:58:04.764516 ignition[739]: fetch-offline: fetch-offline passed Mar 17 17:58:04.702005 systemd[1]: Reached target network.target - Network. Mar 17 17:58:04.764729 ignition[739]: Ignition finished successfully Mar 17 17:58:04.742705 systemd-networkd[779]: eth0: DHCPv4 address 10.0.0.118/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 17 17:58:04.763470 unknown[739]: fetched base config from "system" Mar 17 17:58:04.763491 unknown[739]: fetched user config from "qemu" Mar 17 17:58:04.769329 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 17:58:04.773340 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 17 17:58:04.786569 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 17 17:58:04.825489 ignition[792]: Ignition 2.20.0 Mar 17 17:58:04.825502 ignition[792]: Stage: kargs Mar 17 17:58:04.825720 ignition[792]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:58:04.825734 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:58:04.826755 ignition[792]: kargs: kargs passed Mar 17 17:58:04.826812 ignition[792]: Ignition finished successfully Mar 17 17:58:04.847772 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 17 17:58:04.867974 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 17 17:58:04.905758 ignition[801]: Ignition 2.20.0 Mar 17 17:58:04.905777 ignition[801]: Stage: disks Mar 17 17:58:04.906016 ignition[801]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:58:04.906032 ignition[801]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:58:04.908261 ignition[801]: disks: disks passed Mar 17 17:58:04.908322 ignition[801]: Ignition finished successfully Mar 17 17:58:04.930394 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
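The fetch-offline stage above found no config URL on the command line, loaded `qemu_fw_cfg`, and logged the SHA-512 of the config it ultimately parsed. A hedged sketch of how that hash can be reproduced on the host to confirm the guest received the intended file (the filename `config.ign` and the use of an empty stand-in file are illustrative only):

```shell
# Recompute the digest Ignition logs ("parsing config with SHA512: ...").
: > /tmp/config.ign                # empty stand-in for a real Ignition config
sha512sum /tmp/config.ign          # compare against the hash in the journal

# Handing the config to a QEMU guest over the fw_cfg path probed above:
#   qemu-system-x86_64 ... -fw_cfg name=opt/com.coreos/config,file=config.ign
```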
Mar 17 17:58:04.936723 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 17 17:58:04.939548 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 17 17:58:04.950672 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:58:04.952078 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:58:04.956096 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:58:04.971242 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 17 17:58:05.005960 systemd-fsck[812]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 17 17:58:05.019330 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 17 17:58:05.165820 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 17 17:58:05.420983 kernel: EXT4-fs (vda9): mounted filesystem 21764504-a65e-45eb-84e1-376b55b62aba r/w with ordered data mode. Quota mode: none. Mar 17 17:58:05.425251 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 17 17:58:05.432076 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 17 17:58:05.466512 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:58:05.509248 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 17 17:58:05.549497 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (820) Mar 17 17:58:05.549528 kernel: BTRFS info (device vda6): first mount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 17:58:05.549543 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:58:05.549556 kernel: BTRFS info (device vda6): using free space tree Mar 17 17:58:05.527351 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
Mar 17 17:58:05.527436 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 17:58:05.565989 kernel: BTRFS info (device vda6): auto enabling async discard Mar 17 17:58:05.527478 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:58:05.570219 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 17 17:58:05.579497 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 17 17:58:05.602911 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 17 17:58:05.695445 initrd-setup-root[844]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 17:58:05.709052 initrd-setup-root[851]: cut: /sysroot/etc/group: No such file or directory Mar 17 17:58:05.734683 initrd-setup-root[858]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 17:58:05.763231 initrd-setup-root[865]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 17:58:05.977940 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 17 17:58:05.993978 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 17 17:58:05.997051 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 17 17:58:06.018673 kernel: BTRFS info (device vda6): last unmount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 17:58:06.115461 ignition[933]: INFO : Ignition 2.20.0 Mar 17 17:58:06.115461 ignition[933]: INFO : Stage: mount Mar 17 17:58:06.115461 ignition[933]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:58:06.115461 ignition[933]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:58:06.135644 ignition[933]: INFO : mount: mount passed Mar 17 17:58:06.135644 ignition[933]: INFO : Ignition finished successfully Mar 17 17:58:06.122067 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
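The `cut: /sysroot/etc/passwd: No such file or directory` lines above are expected on first boot: the root-filesystem setup script extracts account fields with cut(1) before those files exist. A small illustration of that extraction, assuming a passwd-style file (the sample entry is hypothetical):

```shell
# Pull the user-name field (field 1, ':'-delimited) from a passwd-format file,
# the same kind of extraction initrd-setup-root attempts against /sysroot.
printf 'root:x:0:0:root:/root:/bin/bash\n' > /tmp/passwd.sample
cut -d: -f1 /tmp/passwd.sample    # prints: root
```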
Mar 17 17:58:06.149953 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 17 17:58:06.156003 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 17 17:58:06.156775 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 17 17:58:06.209994 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:58:06.236636 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (946) Mar 17 17:58:06.236724 kernel: BTRFS info (device vda6): first mount of filesystem e64ce651-fa93-44de-893d-ff1e0bc9061f Mar 17 17:58:06.239158 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 17:58:06.239224 kernel: BTRFS info (device vda6): using free space tree Mar 17 17:58:06.256007 kernel: BTRFS info (device vda6): auto enabling async discard Mar 17 17:58:06.262801 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 17 17:58:06.329171 ignition[963]: INFO : Ignition 2.20.0 Mar 17 17:58:06.329171 ignition[963]: INFO : Stage: files Mar 17 17:58:06.335931 ignition[963]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:58:06.335931 ignition[963]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:58:06.335931 ignition[963]: DEBUG : files: compiled without relabeling support, skipping Mar 17 17:58:06.335931 ignition[963]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 17:58:06.335931 ignition[963]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 17:58:06.353912 ignition[963]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 17:58:06.353912 ignition[963]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 17:58:06.360624 ignition[963]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 17:58:06.357641 unknown[963]: wrote ssh authorized keys file for user: core Mar 17 17:58:06.369860 
ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 17:58:06.369860 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 17 17:58:06.468804 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 17 17:58:06.636966 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 17:58:06.636966 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 17:58:06.649436 ignition[963]: 
INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 17:58:06.649436 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 17 17:58:06.667828 systemd-networkd[779]: eth0: Gained IPv6LL Mar 17 17:58:07.027731 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 17 17:58:07.739230 ignition[963]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 17:58:07.739230 ignition[963]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 17 17:58:07.746338 ignition[963]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 17:58:07.750698 ignition[963]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 17:58:07.750698 ignition[963]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 17 17:58:07.750698 ignition[963]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Mar 17 
17:58:07.776049 ignition[963]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 17 17:58:07.776049 ignition[963]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 17 17:58:07.776049 ignition[963]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Mar 17 17:58:07.776049 ignition[963]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Mar 17 17:58:07.858114 ignition[963]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Mar 17 17:58:07.876596 ignition[963]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Mar 17 17:58:07.876596 ignition[963]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Mar 17 17:58:07.876596 ignition[963]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Mar 17 17:58:07.876596 ignition[963]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Mar 17 17:58:07.900078 ignition[963]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 17:58:07.900078 ignition[963]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 17:58:07.900078 ignition[963]: INFO : files: files passed Mar 17 17:58:07.900078 ignition[963]: INFO : Ignition finished successfully Mar 17 17:58:07.894598 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 17 17:58:07.932867 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 17 17:58:07.940849 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 17 17:58:07.963246 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 17:58:07.963399 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 17 17:58:07.975229 initrd-setup-root-after-ignition[990]: grep: /sysroot/oem/oem-release: No such file or directory Mar 17 17:58:07.983881 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:58:07.983881 initrd-setup-root-after-ignition[993]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:58:07.990129 initrd-setup-root-after-ignition[997]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 17:58:07.995512 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 17:58:08.010203 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 17 17:58:08.050218 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 17 17:58:08.153178 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 17:58:08.154689 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 17 17:58:08.160999 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 17 17:58:08.168213 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 17 17:58:08.168358 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 17 17:58:08.178622 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 17 17:58:08.226685 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 17 17:58:08.251874 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 17 17:58:08.310798 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. 
Mar 17 17:58:08.314807 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:58:08.322995 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:58:08.325755 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:58:08.325975 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:58:08.329299 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:58:08.334842 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:58:08.343433 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:58:08.350181 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:58:08.359296 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:58:08.359612 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:58:08.360021 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:58:08.360451 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:58:08.361433 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:58:08.364746 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:58:08.366779 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:58:08.369650 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:58:08.370036 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:58:08.370458 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:58:08.370968 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:58:08.371262 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:58:08.372973 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:58:08.373148 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:58:08.373911 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:58:08.374064 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:58:08.374603 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:58:08.375054 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:58:08.378727 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:58:08.379464 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:58:08.474288 ignition[1017]: INFO : Ignition 2.20.0
Mar 17 17:58:08.474288 ignition[1017]: INFO : Stage: umount
Mar 17 17:58:08.474288 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:58:08.474288 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:58:08.380024 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:58:08.523227 ignition[1017]: INFO : umount: umount passed
Mar 17 17:58:08.523227 ignition[1017]: INFO : Ignition finished successfully
Mar 17 17:58:08.380394 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:58:08.380529 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:58:08.381189 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:58:08.385980 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:58:08.393935 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:58:08.394121 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:58:08.401191 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:58:08.401376 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:58:08.442094 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:58:08.498025 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:58:08.498289 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:58:08.499603 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:58:08.499664 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:58:08.499822 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:58:08.500001 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:58:08.500149 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:58:08.502728 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:58:08.507279 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:58:08.535344 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 17 17:58:08.535556 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 17 17:58:08.539750 systemd[1]: Stopped target network.target - Network.
Mar 17 17:58:08.546781 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:58:08.546916 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:58:08.556934 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:58:08.557042 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:58:08.563032 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:58:08.563134 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:58:08.565820 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:58:08.565908 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:58:08.588309 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 17 17:58:08.590765 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 17 17:58:08.608571 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 17 17:58:08.624244 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 17 17:58:08.624423 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 17 17:58:08.649909 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 17 17:58:08.650361 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 17 17:58:08.650539 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 17 17:58:08.676141 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 17 17:58:08.679375 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 17 17:58:08.679444 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:58:08.697013 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 17 17:58:08.703146 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 17 17:58:08.703279 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:58:08.706161 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 17 17:58:08.706253 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:58:08.714129 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 17 17:58:08.714214 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:58:08.720339 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 17 17:58:08.720443 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:58:08.730211 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:58:08.741403 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 17 17:58:08.741515 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 17 17:58:08.803284 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 17 17:58:08.803454 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 17 17:58:08.813347 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 17 17:58:08.813603 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:58:08.821152 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 17 17:58:08.821250 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:58:08.822892 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 17 17:58:08.822953 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:58:08.828488 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 17 17:58:08.828612 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:58:08.830227 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 17 17:58:08.830304 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:58:08.838137 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:58:08.839589 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:58:08.878400 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 17 17:58:08.891135 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 17 17:58:08.891251 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:58:08.904402 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:58:08.904495 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:58:08.910206 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 17 17:58:08.910294 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 17 17:58:08.910893 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 17 17:58:08.911043 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 17 17:58:08.939328 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 17 17:58:08.939529 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 17 17:58:08.942165 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 17 17:58:08.944007 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 17 17:58:08.944098 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 17 17:58:08.973921 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 17 17:58:08.989232 systemd[1]: Switching root.
Mar 17 17:58:09.040770 systemd-journald[194]: Journal stopped
Mar 17 17:58:11.491239 systemd-journald[194]: Received SIGTERM from PID 1 (systemd).
Mar 17 17:58:11.491345 kernel: SELinux: policy capability network_peer_controls=1
Mar 17 17:58:11.491373 kernel: SELinux: policy capability open_perms=1
Mar 17 17:58:11.491390 kernel: SELinux: policy capability extended_socket_class=1
Mar 17 17:58:11.491414 kernel: SELinux: policy capability always_check_network=0
Mar 17 17:58:11.491430 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 17 17:58:11.491449 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 17 17:58:11.491471 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 17 17:58:11.491486 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 17 17:58:11.491502 kernel: audit: type=1403 audit(1742234289.551:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 17 17:58:11.491526 systemd[1]: Successfully loaded SELinux policy in 68.377ms.
Mar 17 17:58:11.491551 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 21.587ms.
Mar 17 17:58:11.491571 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 17 17:58:11.491611 systemd[1]: Detected virtualization kvm.
Mar 17 17:58:11.491629 systemd[1]: Detected architecture x86-64.
Mar 17 17:58:11.491646 systemd[1]: Detected first boot.
Mar 17 17:58:11.491663 systemd[1]: Initializing machine ID from VM UUID.
Mar 17 17:58:11.491691 zram_generator::config[1084]: No configuration found.
Mar 17 17:58:11.491710 kernel: Guest personality initialized and is inactive
Mar 17 17:58:11.491726 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 17 17:58:11.491741 kernel: Initialized host personality
Mar 17 17:58:11.491762 kernel: NET: Registered PF_VSOCK protocol family
Mar 17 17:58:11.491779 systemd[1]: Populated /etc with preset unit settings.
Mar 17 17:58:11.491798 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 17 17:58:11.491815 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 17 17:58:11.491831 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 17 17:58:11.491848 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:58:11.491868 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 17 17:58:11.491886 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 17 17:58:11.491902 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 17 17:58:11.491925 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 17 17:58:11.491943 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 17 17:58:11.491961 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 17 17:58:11.491979 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 17 17:58:11.491995 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 17 17:58:11.492013 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:58:11.492030 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:58:11.492048 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 17 17:58:11.492065 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 17 17:58:11.492086 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 17 17:58:11.492104 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:58:11.492121 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 17 17:58:11.492145 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:58:11.492162 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 17 17:58:11.492179 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 17 17:58:11.492196 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:58:11.492219 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 17 17:58:11.492237 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:58:11.492257 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:58:11.492275 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:58:11.492293 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:58:11.492310 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 17 17:58:11.492328 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 17 17:58:11.492345 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 17 17:58:11.492362 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:58:11.492384 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:58:11.492401 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:58:11.492417 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 17 17:58:11.492434 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 17 17:58:11.492452 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 17 17:58:11.492468 systemd[1]: Mounting media.mount - External Media Directory...
Mar 17 17:58:11.492492 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:58:11.492509 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 17 17:58:11.492527 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 17 17:58:11.492548 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 17 17:58:11.492567 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 17 17:58:11.492602 systemd[1]: Reached target machines.target - Containers.
Mar 17 17:58:11.492620 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 17 17:58:11.492638 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:58:11.492659 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:58:11.492686 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 17 17:58:11.492703 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:58:11.492725 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:58:11.492742 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:58:11.492759 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 17 17:58:11.492775 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:58:11.492793 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 17 17:58:11.492811 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 17 17:58:11.492828 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 17 17:58:11.492846 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 17 17:58:11.492864 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 17 17:58:11.492887 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:58:11.492905 kernel: fuse: init (API version 7.39)
Mar 17 17:58:11.492922 kernel: loop: module loaded
Mar 17 17:58:11.492939 kernel: ACPI: bus type drm_connector registered
Mar 17 17:58:11.492956 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:58:11.492979 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:58:11.493001 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 17 17:58:11.493022 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 17 17:58:11.493043 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 17 17:58:11.493073 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:58:11.493093 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 17 17:58:11.493112 systemd[1]: Stopped verity-setup.service.
Mar 17 17:58:11.493185 systemd-journald[1170]: Collecting audit messages is disabled.
Mar 17 17:58:11.493224 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:58:11.493244 systemd-journald[1170]: Journal started
Mar 17 17:58:11.493275 systemd-journald[1170]: Runtime Journal (/run/log/journal/416dcbd3aaba48fcb4bfd9d937bc5b75) is 6M, max 48.4M, 42.3M free.
Mar 17 17:58:10.797243 systemd[1]: Queued start job for default target multi-user.target.
Mar 17 17:58:10.814683 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 17 17:58:10.815467 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 17 17:58:11.510087 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:58:11.511164 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 17 17:58:11.515470 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 17 17:58:11.517215 systemd[1]: Mounted media.mount - External Media Directory.
Mar 17 17:58:11.522057 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 17 17:58:11.527427 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 17 17:58:11.528970 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 17 17:58:11.531141 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 17 17:58:11.535995 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:58:11.538201 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 17 17:58:11.538544 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 17 17:58:11.544305 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:58:11.544663 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:58:11.549599 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:58:11.549977 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:58:11.552224 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:58:11.552547 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:58:11.554700 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 17 17:58:11.555016 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 17 17:58:11.556959 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:58:11.557261 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:58:11.559780 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:58:11.561858 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 17 17:58:11.566012 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 17 17:58:11.571002 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 17 17:58:11.603771 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 17 17:58:11.625828 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 17 17:58:11.643798 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 17 17:58:11.657231 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 17 17:58:11.657300 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:58:11.664183 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 17 17:58:11.688502 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 17 17:58:11.695288 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 17 17:58:11.700113 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:58:11.706066 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 17 17:58:11.714288 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 17 17:58:11.716832 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:58:11.722001 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 17 17:58:11.723754 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:58:11.738968 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:58:11.741147 systemd-journald[1170]: Time spent on flushing to /var/log/journal/416dcbd3aaba48fcb4bfd9d937bc5b75 is 40.333ms for 963 entries.
Mar 17 17:58:11.741147 systemd-journald[1170]: System Journal (/var/log/journal/416dcbd3aaba48fcb4bfd9d937bc5b75) is 8M, max 195.6M, 187.6M free.
Mar 17 17:58:11.840937 systemd-journald[1170]: Received client request to flush runtime journal.
Mar 17 17:58:11.841012 kernel: loop0: detected capacity change from 0 to 138176
Mar 17 17:58:11.744609 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 17 17:58:11.753531 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 17 17:58:11.764108 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:58:11.773502 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 17 17:58:11.783476 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 17 17:58:11.785602 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 17 17:58:11.789952 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 17 17:58:11.803191 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 17 17:58:11.828067 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 17 17:58:11.837208 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 17 17:58:11.847790 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 17 17:58:11.864994 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:58:11.881397 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 17 17:58:11.885500 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 17 17:58:11.889705 udevadm[1217]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 17 17:58:11.912650 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 17 17:58:11.917617 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 17 17:58:11.939945 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:58:11.966625 kernel: loop1: detected capacity change from 0 to 210664
Mar 17 17:58:11.986993 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Mar 17 17:58:11.987023 systemd-tmpfiles[1225]: ACLs are not supported, ignoring.
Mar 17 17:58:12.001980 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:58:12.045620 kernel: loop2: detected capacity change from 0 to 147912
Mar 17 17:58:12.119627 kernel: loop3: detected capacity change from 0 to 138176
Mar 17 17:58:12.189631 kernel: loop4: detected capacity change from 0 to 210664
Mar 17 17:58:12.226622 kernel: loop5: detected capacity change from 0 to 147912
Mar 17 17:58:12.262884 (sd-merge)[1230]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Mar 17 17:58:12.265267 (sd-merge)[1230]: Merged extensions into '/usr'.
Mar 17 17:58:12.275005 systemd[1]: Reload requested from client PID 1205 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 17 17:58:12.275031 systemd[1]: Reloading...
Mar 17 17:58:12.408614 zram_generator::config[1258]: No configuration found.
Mar 17 17:58:12.603798 ldconfig[1200]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 17 17:58:12.722661 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:58:12.860018 systemd[1]: Reloading finished in 584 ms.
Mar 17 17:58:12.883473 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 17 17:58:12.888507 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 17 17:58:12.919010 systemd[1]: Starting ensure-sysext.service...
Mar 17 17:58:12.924998 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:58:12.958531 systemd[1]: Reload requested from client PID 1295 ('systemctl') (unit ensure-sysext.service)...
Mar 17 17:58:12.958552 systemd[1]: Reloading...
Mar 17 17:58:13.016485 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 17 17:58:13.018485 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 17 17:58:13.028095 systemd-tmpfiles[1296]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 17 17:58:13.038508 systemd-tmpfiles[1296]: ACLs are not supported, ignoring.
Mar 17 17:58:13.038649 systemd-tmpfiles[1296]: ACLs are not supported, ignoring.
Mar 17 17:58:13.089639 zram_generator::config[1331]: No configuration found.
Mar 17 17:58:13.100439 systemd-tmpfiles[1296]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:58:13.100459 systemd-tmpfiles[1296]: Skipping /boot
Mar 17 17:58:13.120115 systemd-tmpfiles[1296]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:58:13.120140 systemd-tmpfiles[1296]: Skipping /boot
Mar 17 17:58:13.321668 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:58:13.433344 systemd[1]: Reloading finished in 474 ms.
Mar 17 17:58:13.462859 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 17 17:58:13.497409 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:58:13.532817 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:58:13.546001 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 17 17:58:13.567005 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 17 17:58:13.583112 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:58:13.596102 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:58:13.611919 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 17 17:58:13.640233 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 17 17:58:13.659409 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:58:13.660095 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:58:13.682253 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:58:13.689569 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:58:13.696310 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:58:13.697991 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:58:13.698287 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:58:13.709446 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 17 17:58:13.719521 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 17 17:58:13.721219 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:58:13.725439 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:58:13.725840 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:58:13.733327 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:58:13.741135 augenrules[1395]: No rules
Mar 17 17:58:13.733742 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:58:13.741837 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:58:13.742211 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:58:13.744889 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:58:13.745200 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:58:13.746266 systemd-udevd[1374]: Using default interface naming scheme 'v255'.
Mar 17 17:58:13.774015 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 17 17:58:13.777413 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 17 17:58:13.800396 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:58:13.827011 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:58:13.832230 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:58:13.852482 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:58:13.865358 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:58:13.874347 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:58:13.882991 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:58:13.887044 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:58:13.887100 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:58:13.887194 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 17 17:58:13.888015 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:58:13.889980 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 17 17:58:13.892023 systemd[1]: Finished ensure-sysext.service.
Mar 17 17:58:13.896202 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 17 17:58:13.901728 augenrules[1406]: /sbin/augenrules: No change
Mar 17 17:58:13.903311 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:58:13.903759 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:58:13.906333 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:58:13.906685 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:58:13.913127 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:58:13.913682 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:58:13.922564 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:58:13.922820 augenrules[1440]: No rules
Mar 17 17:58:13.923004 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:58:13.929887 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:58:13.930303 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:58:13.972157 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:58:13.978922 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:58:13.979054 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:58:13.986539 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 17 17:58:13.995749 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 17 17:58:14.059979 systemd-resolved[1373]: Positive Trust Anchors:
Mar 17 17:58:14.060388 systemd-resolved[1373]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:58:14.060496 systemd-resolved[1373]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:58:14.074665 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (1458)
Mar 17 17:58:14.081138 systemd-resolved[1373]: Defaulting to hostname 'linux'.
Mar 17 17:58:14.087076 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:58:14.088960 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:58:14.093832 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 17 17:58:14.183089 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 17 17:58:14.198883 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 17 17:58:14.222431 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 17 17:58:14.224536 systemd[1]: Reached target time-set.target - System Time Set.
Mar 17 17:58:14.224667 systemd-networkd[1459]: lo: Link UP
Mar 17 17:58:14.224673 systemd-networkd[1459]: lo: Gained carrier
Mar 17 17:58:14.231241 systemd-networkd[1459]: Enumeration completed
Mar 17 17:58:14.231699 systemd-networkd[1459]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:58:14.231705 systemd-networkd[1459]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:58:14.232417 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 17 17:58:14.234085 systemd-networkd[1459]: eth0: Link UP
Mar 17 17:58:14.234092 systemd-networkd[1459]: eth0: Gained carrier
Mar 17 17:58:14.234112 systemd-networkd[1459]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:58:14.238771 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:58:14.243372 systemd[1]: Reached target network.target - Network.
Mar 17 17:58:14.280648 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 17 17:58:14.279977 systemd-networkd[1459]: eth0: DHCPv4 address 10.0.0.118/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 17 17:58:14.281197 systemd-timesyncd[1462]: Network configuration changed, trying to establish connection.
Mar 17 17:58:14.284802 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 17 17:58:14.285424 systemd-timesyncd[1462]: Contacted time server 10.0.0.1:123 (10.0.0.1).
Mar 17 17:58:14.285492 systemd-timesyncd[1462]: Initial clock synchronization to Mon 2025-03-17 17:58:14.559400 UTC.
Mar 17 17:58:14.299945 kernel: ACPI: button: Power Button [PWRF]
Mar 17 17:58:14.299812 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 17 17:58:14.338185 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 17 17:58:14.339004 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 17 17:58:14.339270 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 17 17:58:14.351074 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 17 17:58:14.401473 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:58:14.435678 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Mar 17 17:58:14.653812 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:58:14.775621 kernel: mousedev: PS/2 mouse device common for all mice
Mar 17 17:58:14.812947 kernel: kvm_amd: TSC scaling supported
Mar 17 17:58:14.813050 kernel: kvm_amd: Nested Virtualization enabled
Mar 17 17:58:14.813099 kernel: kvm_amd: Nested Paging enabled
Mar 17 17:58:14.817915 kernel: kvm_amd: LBR virtualization supported
Mar 17 17:58:14.818627 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported
Mar 17 17:58:14.819637 kernel: kvm_amd: Virtual GIF supported
Mar 17 17:58:15.014395 kernel: EDAC MC: Ver: 3.0.0
Mar 17 17:58:15.076223 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 17 17:58:15.093013 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 17 17:58:15.113214 lvm[1493]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 17 17:58:15.160804 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 17 17:58:15.163327 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:58:15.169856 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:58:15.181469 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 17 17:58:15.186357 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 17 17:58:15.197580 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 17 17:58:15.201020 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 17 17:58:15.203166 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 17 17:58:15.210531 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 17 17:58:15.212725 systemd[1]: Reached target paths.target - Path Units.
Mar 17 17:58:15.216160 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 17:58:15.237900 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 17 17:58:15.253154 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 17 17:58:15.259707 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 17 17:58:15.273297 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 17 17:58:15.277698 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 17 17:58:15.285807 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 17 17:58:15.290118 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 17 17:58:15.298917 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 17 17:58:15.302912 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 17 17:58:15.304896 systemd[1]: Reached target sockets.target - Socket Units.
Mar 17 17:58:15.311036 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:58:15.312778 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 17 17:58:15.312814 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 17 17:58:15.316789 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 17 17:58:15.326634 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 17 17:58:15.334886 lvm[1497]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 17 17:58:15.340814 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 17 17:58:15.358657 jq[1500]: false
Mar 17 17:58:15.362136 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 17 17:58:15.366177 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 17 17:58:15.367982 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 17 17:58:15.372935 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 17 17:58:15.381707 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 17 17:58:15.393957 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 17 17:58:15.403551 extend-filesystems[1501]: Found loop3
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found loop4
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found loop5
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found sr0
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda1
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda2
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda3
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found usr
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda4
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda6
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda7
Mar 17 17:58:15.408044 extend-filesystems[1501]: Found vda9
Mar 17 17:58:15.408044 extend-filesystems[1501]: Checking size of /dev/vda9
Mar 17 17:58:15.422081 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 17 17:58:15.430484 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 17 17:58:15.431389 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 17 17:58:15.432507 systemd[1]: Starting update-engine.service - Update Engine...
Mar 17 17:58:15.436429 dbus-daemon[1499]: [system] SELinux support is enabled
Mar 17 17:58:15.438849 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 17 17:58:15.443080 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 17 17:58:15.449715 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 17 17:58:15.462546 extend-filesystems[1501]: Resized partition /dev/vda9
Mar 17 17:58:15.473628 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (1458)
Mar 17 17:58:15.473694 update_engine[1512]: I20250317 17:58:15.472043 1512 main.cc:92] Flatcar Update Engine starting
Mar 17 17:58:15.470695 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 17 17:58:15.474127 update_engine[1512]: I20250317 17:58:15.473817 1512 update_check_scheduler.cc:74] Next update check in 10m56s
Mar 17 17:58:15.471389 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 17 17:58:15.471904 systemd[1]: motdgen.service: Deactivated successfully.
Mar 17 17:58:15.472262 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 17 17:58:15.478687 extend-filesystems[1522]: resize2fs 1.47.1 (20-May-2024)
Mar 17 17:58:15.484314 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 17 17:58:15.484691 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 17 17:58:15.492649 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks
Mar 17 17:58:15.492744 jq[1513]: true
Mar 17 17:58:15.493176 systemd-networkd[1459]: eth0: Gained IPv6LL
Mar 17 17:58:15.519701 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 17 17:58:15.550449 jq[1525]: true
Mar 17 17:58:15.564180 (ntainerd)[1527]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 17 17:58:15.629535 tar[1523]: linux-amd64/helm
Mar 17 17:58:15.646663 kernel: EXT4-fs (vda9): resized filesystem to 1864699
Mar 17 17:58:15.660221 systemd[1]: Started update-engine.service - Update Engine.
Mar 17 17:58:15.662286 systemd[1]: Reached target network-online.target - Network is Online.
Mar 17 17:58:15.676464 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent...
Mar 17 17:58:15.683847 systemd-logind[1507]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 17 17:58:15.683887 systemd-logind[1507]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 17 17:58:15.695733 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:58:15.707097 extend-filesystems[1522]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 17 17:58:15.707097 extend-filesystems[1522]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 17 17:58:15.707097 extend-filesystems[1522]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
Mar 17 17:58:15.737915 extend-filesystems[1501]: Resized filesystem in /dev/vda9
Mar 17 17:58:15.741322 bash[1553]: Updated "/home/core/.ssh/authorized_keys"
Mar 17 17:58:15.714753 systemd-logind[1507]: New seat seat0.
Mar 17 17:58:15.725921 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 17 17:58:15.727668 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 17 17:58:15.727713 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 17 17:58:15.729510 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 17 17:58:15.729536 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 17 17:58:15.748870 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 17 17:58:15.751188 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 17 17:58:15.753690 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 17 17:58:15.754066 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 17 17:58:15.756644 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 17 17:58:15.804045 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 17 17:58:15.812189 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 17 17:58:15.818534 systemd[1]: coreos-metadata.service: Deactivated successfully.
Mar 17 17:58:15.823296 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Mar 17 17:58:15.838500 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 17 17:58:15.882926 sshd_keygen[1524]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 17 17:58:15.941841 locksmithd[1561]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 17 17:58:15.950385 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 17 17:58:15.986889 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 17 17:58:15.996516 systemd[1]: issuegen.service: Deactivated successfully.
Mar 17 17:58:15.996901 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 17 17:58:16.021230 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 17 17:58:16.060180 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 17 17:58:16.082170 containerd[1527]: time="2025-03-17T17:58:16.079700055Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Mar 17 17:58:16.088335 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 17 17:58:16.094497 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 17 17:58:16.096574 systemd[1]: Reached target getty.target - Login Prompts.
Mar 17 17:58:16.128446 containerd[1527]: time="2025-03-17T17:58:16.128135189Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.131899890Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.131956531Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.131981307Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.132238408Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.132261121Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.132347543Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.132363930Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.132806 containerd[1527]: time="2025-03-17T17:58:16.132743064Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133155 containerd[1527]: time="2025-03-17T17:58:16.132816124Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133155 containerd[1527]: time="2025-03-17T17:58:16.132888895Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133155 containerd[1527]: time="2025-03-17T17:58:16.132908335Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133155 containerd[1527]: time="2025-03-17T17:58:16.133129298Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133546 containerd[1527]: time="2025-03-17T17:58:16.133488363Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133907 containerd[1527]: time="2025-03-17T17:58:16.133856899Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 17 17:58:16.133907 containerd[1527]: time="2025-03-17T17:58:16.133892583Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 17 17:58:16.134082 containerd[1527]: time="2025-03-17T17:58:16.134039547Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 17 17:58:16.134174 containerd[1527]: time="2025-03-17T17:58:16.134133854Z" level=info msg="metadata content store policy set" policy=shared
Mar 17 17:58:16.160291 containerd[1527]: time="2025-03-17T17:58:16.160233487Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 17 17:58:16.160695 containerd[1527]: time="2025-03-17T17:58:16.160545631Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 17 17:58:16.160695 containerd[1527]: time="2025-03-17T17:58:16.160616719Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 17 17:58:16.160695 containerd[1527]: time="2025-03-17T17:58:16.160650091Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 17 17:58:16.161017 containerd[1527]: time="2025-03-17T17:58:16.160849849Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 17 17:58:16.161265 containerd[1527]: time="2025-03-17T17:58:16.161241376Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 17 17:58:16.161871 containerd[1527]: time="2025-03-17T17:58:16.161780611Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162065615Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162107025Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162129336Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162148777Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162167444Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162186917Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162208917Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162230173Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162250946Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162278148Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162296970Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162323974Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162345294Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.167705 containerd[1527]: time="2025-03-17T17:58:16.162365561Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162394589Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162412368Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162432635Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162453819Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162472560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162494106Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162516468Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162535588Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162553151Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162570529Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162619648Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162652669Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162672203Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168218 containerd[1527]: time="2025-03-17T17:58:16.162687538Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162748709Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162778366Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162794938Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162813028Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162826660Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162844904Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162859154Z" level=info msg="NRI interface is disabled by configuration."
Mar 17 17:58:16.168688 containerd[1527]: time="2025-03-17T17:58:16.162876222Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 17 17:58:16.168913 containerd[1527]: time="2025-03-17T17:58:16.163259020Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 17 17:58:16.168913 containerd[1527]: time="2025-03-17T17:58:16.163324299Z" level=info msg="Connect containerd service"
Mar 17 17:58:16.169894 containerd[1527]: time="2025-03-17T17:58:16.169644372Z" level=info msg="using legacy CRI server"
Mar 17 17:58:16.169894 containerd[1527]: time="2025-03-17T17:58:16.169866069Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 17 17:58:16.170367 containerd[1527]: time="2025-03-17T17:58:16.170319800Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.181495986Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.181971758Z" level=info msg=serving...
address=/run/containerd/containerd.sock.ttrpc Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182049318Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182099923Z" level=info msg="Start subscribing containerd event" Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182154861Z" level=info msg="Start recovering state" Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182258775Z" level=info msg="Start event monitor" Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182284944Z" level=info msg="Start snapshots syncer" Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182303632Z" level=info msg="Start cni network conf syncer for default" Mar 17 17:58:16.182375 containerd[1527]: time="2025-03-17T17:58:16.182317263Z" level=info msg="Start streaming server" Mar 17 17:58:16.182537 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 17:58:16.187018 containerd[1527]: time="2025-03-17T17:58:16.186712327Z" level=info msg="containerd successfully booted in 0.114425s" Mar 17 17:58:16.698427 tar[1523]: linux-amd64/LICENSE Mar 17 17:58:16.698427 tar[1523]: linux-amd64/README.md Mar 17 17:58:16.729811 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 17 17:58:17.479459 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:58:17.486688 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 17 17:58:17.487711 systemd[1]: Startup finished in 1.179s (kernel) + 9.720s (initrd) + 7.999s (userspace) = 18.900s. 
Mar 17 17:58:17.491715 (kubelet)[1614]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:58:18.152592 kubelet[1614]: E0317 17:58:18.152498 1614 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:58:18.158287 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:58:18.158575 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:58:18.159153 systemd[1]: kubelet.service: Consumed 1.323s CPU time, 244.9M memory peak.
Mar 17 17:58:25.374777 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 17 17:58:25.385908 systemd[1]: Started sshd@0-10.0.0.118:22-10.0.0.1:60234.service - OpenSSH per-connection server daemon (10.0.0.1:60234).
Mar 17 17:58:25.438901 sshd[1628]: Accepted publickey for core from 10.0.0.1 port 60234 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:25.440969 sshd-session[1628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:25.448719 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 17 17:58:25.465034 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 17 17:58:25.470890 systemd-logind[1507]: New session 1 of user core.
Mar 17 17:58:25.475860 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 17 17:58:25.479183 systemd[1]: Starting user@500.service - User Manager for UID 500...
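The kubelet exit above (status=1/FAILURE) is the expected pre-bootstrap crash loop: /var/lib/kubelet/config.yaml is generated by `kubeadm init` or `kubeadm join`, not shipped in the OS image, so every start before the node is joined fails the same way. A minimal sketch of a presence check — the path is taken from the error record; the helper itself is illustrative, not part of the host tooling:

```shell
# kubelet_config_status PATH - print "present" or "missing" for the
# kubeadm-generated kubelet config. Illustrative helper only.
kubelet_config_status() {
    if [ -f "$1" ]; then
        echo present
    else
        echo missing
    fi
}

# Path taken from the kubelet error record above.
kubelet_config_status /var/lib/kubelet/config.yaml
```

Until the file exists, systemd keeps scheduling restart jobs, which is what the later "restart counter is at 1" / "at 2" records show.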
Mar 17 17:58:25.489488 (systemd)[1632]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 17 17:58:25.491921 systemd-logind[1507]: New session c1 of user core.
Mar 17 17:58:25.639846 systemd[1632]: Queued start job for default target default.target.
Mar 17 17:58:25.651912 systemd[1632]: Created slice app.slice - User Application Slice.
Mar 17 17:58:25.651939 systemd[1632]: Reached target paths.target - Paths.
Mar 17 17:58:25.651979 systemd[1632]: Reached target timers.target - Timers.
Mar 17 17:58:25.653630 systemd[1632]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 17 17:58:25.664868 systemd[1632]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 17 17:58:25.664994 systemd[1632]: Reached target sockets.target - Sockets.
Mar 17 17:58:25.665029 systemd[1632]: Reached target basic.target - Basic System.
Mar 17 17:58:25.665072 systemd[1632]: Reached target default.target - Main User Target.
Mar 17 17:58:25.665103 systemd[1632]: Startup finished in 165ms.
Mar 17 17:58:25.665568 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 17 17:58:25.667155 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 17 17:58:25.735349 systemd[1]: Started sshd@1-10.0.0.118:22-10.0.0.1:44004.service - OpenSSH per-connection server daemon (10.0.0.1:44004).
Mar 17 17:58:25.807318 sshd[1643]: Accepted publickey for core from 10.0.0.1 port 44004 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:25.808901 sshd-session[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:25.813164 systemd-logind[1507]: New session 2 of user core.
Mar 17 17:58:25.820725 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 17 17:58:25.874553 sshd[1645]: Connection closed by 10.0.0.1 port 44004
Mar 17 17:58:25.875007 sshd-session[1643]: pam_unix(sshd:session): session closed for user core
Mar 17 17:58:25.888788 systemd[1]: sshd@1-10.0.0.118:22-10.0.0.1:44004.service: Deactivated successfully.
Mar 17 17:58:25.891111 systemd[1]: session-2.scope: Deactivated successfully.
Mar 17 17:58:25.892875 systemd-logind[1507]: Session 2 logged out. Waiting for processes to exit.
Mar 17 17:58:25.899130 systemd[1]: Started sshd@2-10.0.0.118:22-10.0.0.1:44020.service - OpenSSH per-connection server daemon (10.0.0.1:44020).
Mar 17 17:58:25.900477 systemd-logind[1507]: Removed session 2.
Mar 17 17:58:25.939119 sshd[1650]: Accepted publickey for core from 10.0.0.1 port 44020 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:25.941067 sshd-session[1650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:25.946242 systemd-logind[1507]: New session 3 of user core.
Mar 17 17:58:25.955725 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 17 17:58:26.006905 sshd[1653]: Connection closed by 10.0.0.1 port 44020
Mar 17 17:58:26.007434 sshd-session[1650]: pam_unix(sshd:session): session closed for user core
Mar 17 17:58:26.031786 systemd[1]: sshd@2-10.0.0.118:22-10.0.0.1:44020.service: Deactivated successfully.
Mar 17 17:58:26.034262 systemd[1]: session-3.scope: Deactivated successfully.
Mar 17 17:58:26.036638 systemd-logind[1507]: Session 3 logged out. Waiting for processes to exit.
Mar 17 17:58:26.054131 systemd[1]: Started sshd@3-10.0.0.118:22-10.0.0.1:44028.service - OpenSSH per-connection server daemon (10.0.0.1:44028).
Mar 17 17:58:26.055411 systemd-logind[1507]: Removed session 3.
Mar 17 17:58:26.096132 sshd[1658]: Accepted publickey for core from 10.0.0.1 port 44028 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:26.097977 sshd-session[1658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:26.102529 systemd-logind[1507]: New session 4 of user core.
Mar 17 17:58:26.117739 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 17 17:58:26.172273 sshd[1661]: Connection closed by 10.0.0.1 port 44028
Mar 17 17:58:26.172910 sshd-session[1658]: pam_unix(sshd:session): session closed for user core
Mar 17 17:58:26.185340 systemd[1]: sshd@3-10.0.0.118:22-10.0.0.1:44028.service: Deactivated successfully.
Mar 17 17:58:26.187291 systemd[1]: session-4.scope: Deactivated successfully.
Mar 17 17:58:26.189159 systemd-logind[1507]: Session 4 logged out. Waiting for processes to exit.
Mar 17 17:58:26.197840 systemd[1]: Started sshd@4-10.0.0.118:22-10.0.0.1:44038.service - OpenSSH per-connection server daemon (10.0.0.1:44038).
Mar 17 17:58:26.198740 systemd-logind[1507]: Removed session 4.
Mar 17 17:58:26.236065 sshd[1666]: Accepted publickey for core from 10.0.0.1 port 44038 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:26.237476 sshd-session[1666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:26.241686 systemd-logind[1507]: New session 5 of user core.
Mar 17 17:58:26.251709 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 17 17:58:26.309419 sudo[1670]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 17 17:58:26.309781 sudo[1670]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:58:26.325933 sudo[1670]: pam_unix(sudo:session): session closed for user root
Mar 17 17:58:26.327548 sshd[1669]: Connection closed by 10.0.0.1 port 44038
Mar 17 17:58:26.328031 sshd-session[1666]: pam_unix(sshd:session): session closed for user core
Mar 17 17:58:26.348882 systemd[1]: sshd@4-10.0.0.118:22-10.0.0.1:44038.service: Deactivated successfully.
Mar 17 17:58:26.351013 systemd[1]: session-5.scope: Deactivated successfully.
Mar 17 17:58:26.352658 systemd-logind[1507]: Session 5 logged out. Waiting for processes to exit.
Mar 17 17:58:26.362888 systemd[1]: Started sshd@5-10.0.0.118:22-10.0.0.1:44048.service - OpenSSH per-connection server daemon (10.0.0.1:44048).
Mar 17 17:58:26.364653 systemd-logind[1507]: Removed session 5.
Mar 17 17:58:26.400759 sshd[1675]: Accepted publickey for core from 10.0.0.1 port 44048 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:26.402180 sshd-session[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:26.407022 systemd-logind[1507]: New session 6 of user core.
Mar 17 17:58:26.420877 systemd[1]: Started session-6.scope - Session 6 of User core.
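Session 5 above exists only to run `setenforce 1`, which switches SELinux to enforcing mode at runtime (the change does not persist across reboots). The kernel exposes the current value in /sys/fs/selinux/enforce; a small illustrative mapping of that 0/1 value to the mode names `getenforce` reports (the helper is an assumption for illustration, not part of the host):

```shell
# selinux_mode_name VALUE - map the 0/1 read from /sys/fs/selinux/enforce
# to the names getenforce prints. Illustrative only.
selinux_mode_name() {
    case "$1" in
        0) echo Permissive ;;
        1) echo Enforcing ;;
        *) echo unknown ;;
    esac
}

# After "setenforce 1" the kernel file reads 1:
selinux_mode_name 1
```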
Mar 17 17:58:26.477573 sudo[1680]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 17 17:58:26.477991 sudo[1680]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:58:26.482507 sudo[1680]: pam_unix(sudo:session): session closed for user root
Mar 17 17:58:26.489232 sudo[1679]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 17 17:58:26.489569 sudo[1679]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:58:26.506883 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:58:26.539793 augenrules[1702]: No rules
Mar 17 17:58:26.541758 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:58:26.542037 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:58:26.543195 sudo[1679]: pam_unix(sudo:session): session closed for user root
Mar 17 17:58:26.544761 sshd[1678]: Connection closed by 10.0.0.1 port 44048
Mar 17 17:58:26.545133 sshd-session[1675]: pam_unix(sshd:session): session closed for user core
Mar 17 17:58:26.557412 systemd[1]: sshd@5-10.0.0.118:22-10.0.0.1:44048.service: Deactivated successfully.
Mar 17 17:58:26.559337 systemd[1]: session-6.scope: Deactivated successfully.
Mar 17 17:58:26.561127 systemd-logind[1507]: Session 6 logged out. Waiting for processes to exit.
Mar 17 17:58:26.570894 systemd[1]: Started sshd@6-10.0.0.118:22-10.0.0.1:44052.service - OpenSSH per-connection server daemon (10.0.0.1:44052).
Mar 17 17:58:26.571811 systemd-logind[1507]: Removed session 6.
Mar 17 17:58:26.608368 sshd[1710]: Accepted publickey for core from 10.0.0.1 port 44052 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:58:26.609722 sshd-session[1710]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:58:26.613659 systemd-logind[1507]: New session 7 of user core.
Mar 17 17:58:26.623745 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 17 17:58:26.676884 sudo[1714]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 17 17:58:26.677224 sudo[1714]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:58:26.982821 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 17 17:58:26.982939 (dockerd)[1733]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 17 17:58:27.235394 dockerd[1733]: time="2025-03-17T17:58:27.235249521Z" level=info msg="Starting up"
Mar 17 17:58:28.022341 dockerd[1733]: time="2025-03-17T17:58:28.022262019Z" level=info msg="Loading containers: start."
Mar 17 17:58:28.409309 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:58:28.419731 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:58:28.567895 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:58:28.572251 (kubelet)[1828]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:58:29.003736 kubelet[1828]: E0317 17:58:29.003683 1828 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:58:29.010991 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:58:29.011203 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:58:29.011607 systemd[1]: kubelet.service: Consumed 205ms CPU time, 99.6M memory peak.
Mar 17 17:58:29.054608 kernel: Initializing XFRM netlink socket
Mar 17 17:58:29.137756 systemd-networkd[1459]: docker0: Link UP
Mar 17 17:58:29.180017 dockerd[1733]: time="2025-03-17T17:58:29.179965475Z" level=info msg="Loading containers: done."
Mar 17 17:58:29.193969 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1491282791-merged.mount: Deactivated successfully.
Mar 17 17:58:29.196724 dockerd[1733]: time="2025-03-17T17:58:29.196677593Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 17 17:58:29.196803 dockerd[1733]: time="2025-03-17T17:58:29.196776245Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1
Mar 17 17:58:29.196922 dockerd[1733]: time="2025-03-17T17:58:29.196898092Z" level=info msg="Daemon has completed initialization"
Mar 17 17:58:29.231902 dockerd[1733]: time="2025-03-17T17:58:29.231788395Z" level=info msg="API listen on /run/docker.sock"
Mar 17 17:58:29.231993 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 17 17:58:30.016337 containerd[1527]: time="2025-03-17T17:58:30.016056214Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 17 17:58:30.655510 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3109102412.mount: Deactivated successfully.
Mar 17 17:58:31.666338 containerd[1527]: time="2025-03-17T17:58:31.666274743Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:31.667041 containerd[1527]: time="2025-03-17T17:58:31.666967830Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674573"
Mar 17 17:58:31.668136 containerd[1527]: time="2025-03-17T17:58:31.668106785Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:31.670949 containerd[1527]: time="2025-03-17T17:58:31.670917145Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:31.672044 containerd[1527]: time="2025-03-17T17:58:31.671996398Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 1.655894545s"
Mar 17 17:58:31.672091 containerd[1527]: time="2025-03-17T17:58:31.672044995Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\""
Mar 17 17:58:31.693722 containerd[1527]: time="2025-03-17T17:58:31.693674029Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 17 17:58:33.610894 containerd[1527]: time="2025-03-17T17:58:33.610832688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:33.611806 containerd[1527]: time="2025-03-17T17:58:33.611757783Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619772"
Mar 17 17:58:33.613171 containerd[1527]: time="2025-03-17T17:58:33.613092478Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:33.616734 containerd[1527]: time="2025-03-17T17:58:33.616666904Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:33.617876 containerd[1527]: time="2025-03-17T17:58:33.617818130Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 1.924103353s"
Mar 17 17:58:33.617876 containerd[1527]: time="2025-03-17T17:58:33.617865374Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\""
Mar 17 17:58:33.642783 containerd[1527]: time="2025-03-17T17:58:33.642743603Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 17 17:58:34.739638 containerd[1527]: time="2025-03-17T17:58:34.739545924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:34.740720 containerd[1527]: time="2025-03-17T17:58:34.740673991Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903309"
Mar 17 17:58:34.742846 containerd[1527]: time="2025-03-17T17:58:34.742815684Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:34.745714 containerd[1527]: time="2025-03-17T17:58:34.745667654Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:34.746645 containerd[1527]: time="2025-03-17T17:58:34.746594824Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.10379855s"
Mar 17 17:58:34.746688 containerd[1527]: time="2025-03-17T17:58:34.746644430Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\""
Mar 17 17:58:34.769320 containerd[1527]: time="2025-03-17T17:58:34.769278481Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 17 17:58:36.497290 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2541382635.mount: Deactivated successfully.
Mar 17 17:58:36.806348 containerd[1527]: time="2025-03-17T17:58:36.806146384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:36.807731 containerd[1527]: time="2025-03-17T17:58:36.807672760Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185372"
Mar 17 17:58:36.809957 containerd[1527]: time="2025-03-17T17:58:36.809878714Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:36.818913 containerd[1527]: time="2025-03-17T17:58:36.816629110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:36.818913 containerd[1527]: time="2025-03-17T17:58:36.817560080Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 2.048116984s"
Mar 17 17:58:36.818913 containerd[1527]: time="2025-03-17T17:58:36.817633208Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\""
Mar 17 17:58:36.867942 containerd[1527]: time="2025-03-17T17:58:36.867620560Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 17 17:58:37.465411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1599129133.mount: Deactivated successfully.
Mar 17 17:58:38.875521 containerd[1527]: time="2025-03-17T17:58:38.875454106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:38.876178 containerd[1527]: time="2025-03-17T17:58:38.876147784Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
Mar 17 17:58:38.877219 containerd[1527]: time="2025-03-17T17:58:38.877191663Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:38.880232 containerd[1527]: time="2025-03-17T17:58:38.880187643Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:38.881108 containerd[1527]: time="2025-03-17T17:58:38.881076797Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.013407398s"
Mar 17 17:58:38.881156 containerd[1527]: time="2025-03-17T17:58:38.881106099Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 17 17:58:38.903989 containerd[1527]: time="2025-03-17T17:58:38.903941291Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Mar 17 17:58:39.261706 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 17 17:58:39.270831 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:58:39.428592 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:58:39.434109 (kubelet)[2109]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:58:39.580146 kubelet[2109]: E0317 17:58:39.579997 2109 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:58:39.584960 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:58:39.585213 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:58:39.585711 systemd[1]: kubelet.service: Consumed 207ms CPU time, 97.1M memory peak.
Mar 17 17:58:39.657529 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1229480480.mount: Deactivated successfully.
Mar 17 17:58:39.662887 containerd[1527]: time="2025-03-17T17:58:39.662852110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:39.663718 containerd[1527]: time="2025-03-17T17:58:39.663658729Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290"
Mar 17 17:58:39.664881 containerd[1527]: time="2025-03-17T17:58:39.664855923Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:39.667001 containerd[1527]: time="2025-03-17T17:58:39.666965671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:39.667775 containerd[1527]: time="2025-03-17T17:58:39.667748323Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 763.769394ms"
Mar 17 17:58:39.667827 containerd[1527]: time="2025-03-17T17:58:39.667773545Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Mar 17 17:58:39.689543 containerd[1527]: time="2025-03-17T17:58:39.689501866Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Mar 17 17:58:40.182124 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1150663290.mount: Deactivated successfully.
Mar 17 17:58:42.367186 containerd[1527]: time="2025-03-17T17:58:42.367121229Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:42.368264 containerd[1527]: time="2025-03-17T17:58:42.368187427Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571"
Mar 17 17:58:42.369895 containerd[1527]: time="2025-03-17T17:58:42.369837173Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:42.375616 containerd[1527]: time="2025-03-17T17:58:42.375497541Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:58:42.376847 containerd[1527]: time="2025-03-17T17:58:42.376776105Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 2.687232992s"
Mar 17 17:58:42.376847 containerd[1527]: time="2025-03-17T17:58:42.376838601Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
Mar 17 17:58:44.464636 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:58:44.464802 systemd[1]: kubelet.service: Consumed 207ms CPU time, 97.1M memory peak.
Mar 17 17:58:44.476860 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:58:44.496700 systemd[1]: Reload requested from client PID 2256 ('systemctl') (unit session-7.scope)...
Mar 17 17:58:44.496715 systemd[1]: Reloading...
Mar 17 17:58:44.584607 zram_generator::config[2303]: No configuration found.
Mar 17 17:58:44.918674 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:58:45.022764 systemd[1]: Reloading finished in 525 ms.
Mar 17 17:58:45.085100 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:58:45.089100 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:58:45.089721 systemd[1]: kubelet.service: Deactivated successfully.
Mar 17 17:58:45.089989 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:58:45.090026 systemd[1]: kubelet.service: Consumed 129ms CPU time, 83.6M memory peak.
Mar 17 17:58:45.091648 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:58:45.248520 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:58:45.252668 (kubelet)[2350]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 17 17:58:45.295478 kubelet[2350]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:58:45.295478 kubelet[2350]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 17 17:58:45.295478 kubelet[2350]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:58:45.295901 kubelet[2350]: I0317 17:58:45.295507 2350 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 17 17:58:45.733534 kubelet[2350]: I0317 17:58:45.733424 2350 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Mar 17 17:58:45.733534 kubelet[2350]: I0317 17:58:45.733455 2350 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 17:58:45.733705 kubelet[2350]: I0317 17:58:45.733686 2350 server.go:927] "Client rotation is on, will bootstrap in background"
Mar 17 17:58:45.747847 kubelet[2350]: I0317 17:58:45.747824 2350 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 17 17:58:45.748599 kubelet[2350]: E0317 17:58:45.748345 2350 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.118:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.761409 kubelet[2350]: I0317 17:58:45.761379 2350 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 17 17:58:45.763053 kubelet[2350]: I0317 17:58:45.763014 2350 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 17 17:58:45.763202 kubelet[2350]: I0317 17:58:45.763045 2350 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Mar 17 17:58:45.763288 kubelet[2350]: I0317 17:58:45.763207 2350 topology_manager.go:138] "Creating topology manager with none policy"
Mar 17 17:58:45.763288 kubelet[2350]: I0317 17:58:45.763217 2350 container_manager_linux.go:301] "Creating device plugin manager"
Mar 17 17:58:45.763917 kubelet[2350]: I0317 17:58:45.763896 2350 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:58:45.764494 kubelet[2350]: I0317 17:58:45.764472 2350 kubelet.go:400] "Attempting to sync node with API server"
Mar 17 17:58:45.764494 kubelet[2350]: I0317 17:58:45.764490 2350 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 17 17:58:45.764539 kubelet[2350]: I0317 17:58:45.764511 2350 kubelet.go:312] "Adding apiserver pod source"
Mar 17 17:58:45.764539 kubelet[2350]: I0317 17:58:45.764520 2350 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 17 17:58:45.765017 kubelet[2350]: W0317 17:58:45.764943 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.765017 kubelet[2350]: E0317 17:58:45.764996 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.766812 kubelet[2350]: W0317 17:58:45.766783 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.766883 kubelet[2350]: E0317 17:58:45.766863 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.769047 kubelet[2350]: I0317 17:58:45.769030 2350 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Mar 17 17:58:45.770487 kubelet[2350]: I0317 17:58:45.770468 2350 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 17 17:58:45.770540 kubelet[2350]: W0317 17:58:45.770526 2350 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 17 17:58:45.771204 kubelet[2350]: I0317 17:58:45.771191 2350 server.go:1264] "Started kubelet"
Mar 17 17:58:45.771329 kubelet[2350]: I0317 17:58:45.771295 2350 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 17 17:58:45.772408 kubelet[2350]: I0317 17:58:45.771996 2350 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 17 17:58:45.772408 kubelet[2350]: I0317 17:58:45.772037 2350 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 17 17:58:45.773127 kubelet[2350]: I0317 17:58:45.772876 2350 server.go:455] "Adding debug handlers to kubelet server"
Mar 17 17:58:45.773821 kubelet[2350]: I0317 17:58:45.773793 2350 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 17 17:58:45.775400 kubelet[2350]: E0317 17:58:45.775379 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 17 17:58:45.775464 kubelet[2350]: I0317 17:58:45.775414 2350 volume_manager.go:291] "Starting Kubelet Volume Manager"
Mar 17 17:58:45.775503 kubelet[2350]: I0317 17:58:45.775482 2350 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 17 17:58:45.775537 kubelet[2350]: I0317 17:58:45.775530 2350 reconciler.go:26] "Reconciler: start to sync state"
Mar 17 17:58:45.775752 kubelet[2350]: E0317 17:58:45.775587 2350 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="200ms"
Mar 17 17:58:45.775752 kubelet[2350]: W0317 17:58:45.775736 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.775844 kubelet[2350]: E0317 17:58:45.775767 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.775844 kubelet[2350]: E0317 17:58:45.775742 2350 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.118:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.118:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182da8e377153a0a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-17 17:58:45.771172362 +0000 UTC m=+0.515071204,LastTimestamp:2025-03-17 17:58:45.771172362 +0000 UTC m=+0.515071204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Mar 17 17:58:45.776144 kubelet[2350]: I0317 17:58:45.776037 2350 factory.go:221] Registration of the systemd container factory successfully
Mar 17 17:58:45.776144 kubelet[2350]: E0317 17:58:45.776058 2350 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 17 17:58:45.776144 kubelet[2350]: I0317 17:58:45.776102 2350 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 17 17:58:45.777610 kubelet[2350]: I0317 17:58:45.777148 2350 factory.go:221] Registration of the containerd container factory successfully
Mar 17 17:58:45.792181 kubelet[2350]: I0317 17:58:45.792153 2350 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 17 17:58:45.792181 kubelet[2350]: I0317 17:58:45.792173 2350 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 17 17:58:45.792297 kubelet[2350]: I0317 17:58:45.792201 2350 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:58:45.794798 kubelet[2350]: I0317 17:58:45.794773 2350 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 17 17:58:45.796013 kubelet[2350]: I0317 17:58:45.795990 2350 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 17 17:58:45.796064 kubelet[2350]: I0317 17:58:45.796024 2350 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 17 17:58:45.796064 kubelet[2350]: I0317 17:58:45.796039 2350 kubelet.go:2337] "Starting kubelet main sync loop"
Mar 17 17:58:45.796109 kubelet[2350]: E0317 17:58:45.796084 2350 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 17 17:58:45.796500 kubelet[2350]: W0317 17:58:45.796463 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.796526 kubelet[2350]: E0317 17:58:45.796504 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:45.877309 kubelet[2350]: I0317 17:58:45.877275 2350 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Mar 17 17:58:45.877590 kubelet[2350]: E0317 17:58:45.877563 2350 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost"
Mar 17 17:58:45.896727 kubelet[2350]: E0317 17:58:45.896686 2350 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 17 17:58:45.976259 kubelet[2350]: E0317 17:58:45.976196 2350 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="400ms"
Mar 17 17:58:45.992867 kubelet[2350]: E0317 17:58:45.992701 2350 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.118:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.118:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182da8e377153a0a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-17 17:58:45.771172362 +0000 UTC m=+0.515071204,LastTimestamp:2025-03-17 17:58:45.771172362 +0000 UTC m=+0.515071204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Mar 17 17:58:46.079154 kubelet[2350]: I0317 17:58:46.079113 2350 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Mar 17 17:58:46.079494 kubelet[2350]: E0317 17:58:46.079446 2350 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost"
Mar 17 17:58:46.097559 kubelet[2350]: E0317 17:58:46.097515 2350 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 17 17:58:46.194058 kubelet[2350]: I0317 17:58:46.194009 2350 policy_none.go:49] "None policy: Start"
Mar 17 17:58:46.194782 kubelet[2350]: I0317 17:58:46.194742 2350 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 17 17:58:46.194782 kubelet[2350]: I0317 17:58:46.194769 2350 state_mem.go:35] "Initializing new in-memory state store"
Mar 17 17:58:46.201642 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 17 17:58:46.220461 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 17 17:58:46.223713 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 17 17:58:46.234570 kubelet[2350]: I0317 17:58:46.234520 2350 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 17 17:58:46.234834 kubelet[2350]: I0317 17:58:46.234772 2350 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 17 17:58:46.234930 kubelet[2350]: I0317 17:58:46.234890 2350 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 17 17:58:46.235855 kubelet[2350]: E0317 17:58:46.235831 2350 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Mar 17 17:58:46.377148 kubelet[2350]: E0317 17:58:46.377005 2350 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="800ms"
Mar 17 17:58:46.480847 kubelet[2350]: I0317 17:58:46.480557 2350 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Mar 17 17:58:46.480950 kubelet[2350]: E0317 17:58:46.480916 2350 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost"
Mar 17 17:58:46.498061 kubelet[2350]: I0317 17:58:46.498015 2350 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost"
Mar 17 17:58:46.498854 kubelet[2350]: I0317 17:58:46.498764 2350 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost"
Mar 17 17:58:46.499743 kubelet[2350]: I0317 17:58:46.499725 2350 topology_manager.go:215] "Topology Admit Handler" podUID="567dbeb7f2fb0db97af3c4d7d3e73a7c" podNamespace="kube-system" podName="kube-apiserver-localhost"
Mar 17 17:58:46.504986 systemd[1]: Created slice kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice - libcontainer container kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice.
Mar 17 17:58:46.516875 systemd[1]: Created slice kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice - libcontainer container kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice.
Mar 17 17:58:46.520353 systemd[1]: Created slice kubepods-burstable-pod567dbeb7f2fb0db97af3c4d7d3e73a7c.slice - libcontainer container kubepods-burstable-pod567dbeb7f2fb0db97af3c4d7d3e73a7c.slice.
Mar 17 17:58:46.580466 kubelet[2350]: I0317 17:58:46.580417 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/567dbeb7f2fb0db97af3c4d7d3e73a7c-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"567dbeb7f2fb0db97af3c4d7d3e73a7c\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:58:46.580466 kubelet[2350]: I0317 17:58:46.580462 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/567dbeb7f2fb0db97af3c4d7d3e73a7c-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"567dbeb7f2fb0db97af3c4d7d3e73a7c\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:58:46.580613 kubelet[2350]: I0317 17:58:46.580509 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:58:46.580613 kubelet[2350]: I0317 17:58:46.580530 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:58:46.580613 kubelet[2350]: I0317 17:58:46.580556 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost"
Mar 17 17:58:46.580613 kubelet[2350]: I0317 17:58:46.580570 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/567dbeb7f2fb0db97af3c4d7d3e73a7c-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"567dbeb7f2fb0db97af3c4d7d3e73a7c\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:58:46.580745 kubelet[2350]: I0317 17:58:46.580621 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:58:46.580745 kubelet[2350]: I0317 17:58:46.580666 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:58:46.580745 kubelet[2350]: I0317 17:58:46.580702 2350 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:58:46.586907 kubelet[2350]: W0317 17:58:46.586863 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:46.586907 kubelet[2350]: E0317 17:58:46.586909 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.118:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:46.622498 kubelet[2350]: W0317 17:58:46.622417 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:46.622498 kubelet[2350]: E0317 17:58:46.622496 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.118:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:46.661998 kubelet[2350]: W0317 17:58:46.661874 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:46.661998 kubelet[2350]: E0317 17:58:46.661927 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.118:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:46.814769 kubelet[2350]: E0317 17:58:46.814732 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 17:58:46.815230 containerd[1527]: time="2025-03-17T17:58:46.815194801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,}"
Mar 17 17:58:46.819345 kubelet[2350]: E0317 17:58:46.819331 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 17:58:46.819587 containerd[1527]: time="2025-03-17T17:58:46.819551245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,}"
Mar 17 17:58:46.822874 kubelet[2350]: E0317 17:58:46.822848 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 17:58:46.823301 containerd[1527]: time="2025-03-17T17:58:46.823271865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:567dbeb7f2fb0db97af3c4d7d3e73a7c,Namespace:kube-system,Attempt:0,}"
Mar 17 17:58:47.144354 kubelet[2350]: W0317 17:58:47.144252 2350 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:47.144354 kubelet[2350]: E0317 17:58:47.144292 2350 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.118:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.118:6443: connect: connection refused
Mar 17 17:58:47.177878 kubelet[2350]: E0317 17:58:47.177822 2350 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.118:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.118:6443: connect: connection refused" interval="1.6s"
Mar 17 17:58:47.282664 kubelet[2350]: I0317 17:58:47.282622 2350 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Mar 17 17:58:47.283062 kubelet[2350]: E0317 17:58:47.283006 2350 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.118:6443/api/v1/nodes\": dial tcp 10.0.0.118:6443: connect: connection refused" node="localhost"
Mar 17 17:58:47.497375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2775402253.mount: Deactivated successfully.
Mar 17 17:58:47.503107 containerd[1527]: time="2025-03-17T17:58:47.503068299Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:58:47.506016 containerd[1527]: time="2025-03-17T17:58:47.505910096Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056"
Mar 17 17:58:47.506863 containerd[1527]: time="2025-03-17T17:58:47.506835991Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:58:47.508850 containerd[1527]: time="2025-03-17T17:58:47.508818151Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:58:47.509548 containerd[1527]: time="2025-03-17T17:58:47.509496502Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 17 17:58:47.510507 containerd[1527]: time="2025-03-17T17:58:47.510462729Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:58:47.511208 containerd[1527]: time="2025-03-17T17:58:47.511167968Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 17 17:58:47.513074 containerd[1527]: time="2025-03-17T17:58:47.513041071Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:58:47.514830 containerd[1527]: time="2025-03-17T17:58:47.514803056Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 691.465969ms"
Mar 17 17:58:47.515393 containerd[1527]: time="2025-03-17T17:58:47.515362366Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 700.072307ms"
Mar 17 17:58:47.517945 containerd[1527]: time="2025-03-17T17:58:47.517914563Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 698.296552ms"
Mar 17 17:58:47.672394 containerd[1527]: time="2025-03-17T17:58:47.672264967Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:58:47.672394 containerd[1527]: time="2025-03-17T17:58:47.672322101Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:58:47.672394 containerd[1527]: time="2025-03-17T17:58:47.672332357Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:58:47.672686 containerd[1527]: time="2025-03-17T17:58:47.672400068Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:58:47.677348 containerd[1527]: time="2025-03-17T17:58:47.677061344Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:58:47.677348 containerd[1527]: time="2025-03-17T17:58:47.677118739Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:58:47.677348 containerd[1527]: time="2025-03-17T17:58:47.677136724Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:58:47.677348 containerd[1527]: time="2025-03-17T17:58:47.677220315Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:58:47.681635 containerd[1527]: time="2025-03-17T17:58:47.681405741Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:58:47.681635 containerd[1527]: time="2025-03-17T17:58:47.681450283Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:58:47.681635 containerd[1527]: time="2025-03-17T17:58:47.681479036Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:58:47.681978 containerd[1527]: time="2025-03-17T17:58:47.681572180Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:58:47.700719 systemd[1]: Started cri-containerd-ae209e330f5678200dcf2ea642995fb7e0bea8e255440be0e9a2ae3639a71a52.scope - libcontainer container ae209e330f5678200dcf2ea642995fb7e0bea8e255440be0e9a2ae3639a71a52.
Mar 17 17:58:47.706419 systemd[1]: Started cri-containerd-040c674bfc1b4d5c43d96c6482122921c02d10aa53dacb142b7c8533f831f7bd.scope - libcontainer container 040c674bfc1b4d5c43d96c6482122921c02d10aa53dacb142b7c8533f831f7bd. Mar 17 17:58:47.709613 systemd[1]: Started cri-containerd-087dd818d4eb921fbe2dbd88a9f4a2fc3d54bed8f00cc8b86b910eb7357b7648.scope - libcontainer container 087dd818d4eb921fbe2dbd88a9f4a2fc3d54bed8f00cc8b86b910eb7357b7648. Mar 17 17:58:47.745503 containerd[1527]: time="2025-03-17T17:58:47.745467373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:567dbeb7f2fb0db97af3c4d7d3e73a7c,Namespace:kube-system,Attempt:0,} returns sandbox id \"ae209e330f5678200dcf2ea642995fb7e0bea8e255440be0e9a2ae3639a71a52\"" Mar 17 17:58:47.746874 kubelet[2350]: E0317 17:58:47.746644 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:47.750282 containerd[1527]: time="2025-03-17T17:58:47.750190265Z" level=info msg="CreateContainer within sandbox \"ae209e330f5678200dcf2ea642995fb7e0bea8e255440be0e9a2ae3639a71a52\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 17:58:47.756435 containerd[1527]: time="2025-03-17T17:58:47.755723811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"040c674bfc1b4d5c43d96c6482122921c02d10aa53dacb142b7c8533f831f7bd\"" Mar 17 17:58:47.756519 kubelet[2350]: E0317 17:58:47.756278 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:47.758079 kubelet[2350]: E0317 17:58:47.757690 2350 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed 
certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.118:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.118:6443: connect: connection refused Mar 17 17:58:47.759176 containerd[1527]: time="2025-03-17T17:58:47.759065931Z" level=info msg="CreateContainer within sandbox \"040c674bfc1b4d5c43d96c6482122921c02d10aa53dacb142b7c8533f831f7bd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 17:58:47.759230 containerd[1527]: time="2025-03-17T17:58:47.759207849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,} returns sandbox id \"087dd818d4eb921fbe2dbd88a9f4a2fc3d54bed8f00cc8b86b910eb7357b7648\"" Mar 17 17:58:47.759760 kubelet[2350]: E0317 17:58:47.759738 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:47.761493 containerd[1527]: time="2025-03-17T17:58:47.761474877Z" level=info msg="CreateContainer within sandbox \"087dd818d4eb921fbe2dbd88a9f4a2fc3d54bed8f00cc8b86b910eb7357b7648\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 17:58:47.774075 containerd[1527]: time="2025-03-17T17:58:47.774028869Z" level=info msg="CreateContainer within sandbox \"ae209e330f5678200dcf2ea642995fb7e0bea8e255440be0e9a2ae3639a71a52\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8b08f38d58ed44b7cc1ece14ad90690ad98bb05f9a1462fcfabc54492c9a9b60\"" Mar 17 17:58:47.774519 containerd[1527]: time="2025-03-17T17:58:47.774483927Z" level=info msg="StartContainer for \"8b08f38d58ed44b7cc1ece14ad90690ad98bb05f9a1462fcfabc54492c9a9b60\"" Mar 17 17:58:47.786394 containerd[1527]: time="2025-03-17T17:58:47.786279297Z" level=info msg="CreateContainer within sandbox 
\"040c674bfc1b4d5c43d96c6482122921c02d10aa53dacb142b7c8533f831f7bd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"eb1473e375df8ced15e769cd65887cd7a75a357208ad7b72ef206c729935a081\"" Mar 17 17:58:47.787307 containerd[1527]: time="2025-03-17T17:58:47.787284121Z" level=info msg="StartContainer for \"eb1473e375df8ced15e769cd65887cd7a75a357208ad7b72ef206c729935a081\"" Mar 17 17:58:47.795333 containerd[1527]: time="2025-03-17T17:58:47.795255345Z" level=info msg="CreateContainer within sandbox \"087dd818d4eb921fbe2dbd88a9f4a2fc3d54bed8f00cc8b86b910eb7357b7648\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5c4767b0df1e007e0982eb4594cba2cf82f25bd49e6f8466f370a717929674d2\"" Mar 17 17:58:47.796614 containerd[1527]: time="2025-03-17T17:58:47.795708388Z" level=info msg="StartContainer for \"5c4767b0df1e007e0982eb4594cba2cf82f25bd49e6f8466f370a717929674d2\"" Mar 17 17:58:47.800745 systemd[1]: Started cri-containerd-8b08f38d58ed44b7cc1ece14ad90690ad98bb05f9a1462fcfabc54492c9a9b60.scope - libcontainer container 8b08f38d58ed44b7cc1ece14ad90690ad98bb05f9a1462fcfabc54492c9a9b60. Mar 17 17:58:47.816846 systemd[1]: Started cri-containerd-eb1473e375df8ced15e769cd65887cd7a75a357208ad7b72ef206c729935a081.scope - libcontainer container eb1473e375df8ced15e769cd65887cd7a75a357208ad7b72ef206c729935a081. Mar 17 17:58:47.822711 systemd[1]: Started cri-containerd-5c4767b0df1e007e0982eb4594cba2cf82f25bd49e6f8466f370a717929674d2.scope - libcontainer container 5c4767b0df1e007e0982eb4594cba2cf82f25bd49e6f8466f370a717929674d2. 
Mar 17 17:58:47.868335 containerd[1527]: time="2025-03-17T17:58:47.868277848Z" level=info msg="StartContainer for \"eb1473e375df8ced15e769cd65887cd7a75a357208ad7b72ef206c729935a081\" returns successfully" Mar 17 17:58:47.868727 containerd[1527]: time="2025-03-17T17:58:47.868378211Z" level=info msg="StartContainer for \"8b08f38d58ed44b7cc1ece14ad90690ad98bb05f9a1462fcfabc54492c9a9b60\" returns successfully" Mar 17 17:58:47.875854 containerd[1527]: time="2025-03-17T17:58:47.875807048Z" level=info msg="StartContainer for \"5c4767b0df1e007e0982eb4594cba2cf82f25bd49e6f8466f370a717929674d2\" returns successfully" Mar 17 17:58:48.780450 kubelet[2350]: E0317 17:58:48.780387 2350 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 17 17:58:48.817060 kubelet[2350]: E0317 17:58:48.817023 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:48.818841 kubelet[2350]: E0317 17:58:48.818780 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:48.820415 kubelet[2350]: E0317 17:58:48.820390 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:48.884562 kubelet[2350]: I0317 17:58:48.884523 2350 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:58:48.891771 kubelet[2350]: I0317 17:58:48.891738 2350 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 17:58:48.898172 kubelet[2350]: E0317 17:58:48.898130 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 
17:58:48.998682 kubelet[2350]: E0317 17:58:48.998641 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.099905 kubelet[2350]: E0317 17:58:49.099752 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.199967 kubelet[2350]: E0317 17:58:49.199899 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.300085 kubelet[2350]: E0317 17:58:49.300037 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.400721 kubelet[2350]: E0317 17:58:49.400598 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.501202 kubelet[2350]: E0317 17:58:49.501157 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.601809 kubelet[2350]: E0317 17:58:49.601766 2350 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:58:49.766614 kubelet[2350]: I0317 17:58:49.766441 2350 apiserver.go:52] "Watching apiserver" Mar 17 17:58:49.776194 kubelet[2350]: I0317 17:58:49.776164 2350 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:58:49.828855 kubelet[2350]: E0317 17:58:49.828813 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:49.830671 kubelet[2350]: E0317 17:58:49.830647 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:49.831198 kubelet[2350]: E0317 17:58:49.831167 
2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:50.682246 systemd[1]: Reload requested from client PID 2632 ('systemctl') (unit session-7.scope)... Mar 17 17:58:50.682262 systemd[1]: Reloading... Mar 17 17:58:50.771616 zram_generator::config[2683]: No configuration found. Mar 17 17:58:50.825600 kubelet[2350]: E0317 17:58:50.822823 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:50.825600 kubelet[2350]: E0317 17:58:50.822957 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:50.825600 kubelet[2350]: E0317 17:58:50.823396 2350 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:50.873603 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:58:50.988992 systemd[1]: Reloading finished in 306 ms. Mar 17 17:58:51.011141 kubelet[2350]: I0317 17:58:51.011110 2350 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:58:51.011362 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:58:51.030325 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 17:58:51.030699 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:58:51.030784 systemd[1]: kubelet.service: Consumed 945ms CPU time, 118.9M memory peak. 
Mar 17 17:58:51.042933 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:58:51.201266 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:58:51.205410 (kubelet)[2721]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:58:51.246773 kubelet[2721]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:58:51.246773 kubelet[2721]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 17:58:51.246773 kubelet[2721]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:58:51.246773 kubelet[2721]: I0317 17:58:51.246740 2721 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:58:51.251040 kubelet[2721]: I0317 17:58:51.251002 2721 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 17:58:51.251040 kubelet[2721]: I0317 17:58:51.251028 2721 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:58:51.251242 kubelet[2721]: I0317 17:58:51.251226 2721 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 17:58:51.252390 kubelet[2721]: I0317 17:58:51.252371 2721 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 17 17:58:51.253503 kubelet[2721]: I0317 17:58:51.253462 2721 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:58:51.263008 kubelet[2721]: I0317 17:58:51.262961 2721 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 17:58:51.263284 kubelet[2721]: I0317 17:58:51.263245 2721 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:58:51.263422 kubelet[2721]: I0317 17:58:51.263277 2721 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManager
ReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 17:58:51.263495 kubelet[2721]: I0317 17:58:51.263437 2721 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 17:58:51.263495 kubelet[2721]: I0317 17:58:51.263446 2721 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 17:58:51.263495 kubelet[2721]: I0317 17:58:51.263486 2721 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:58:51.263679 kubelet[2721]: I0317 17:58:51.263592 2721 kubelet.go:400] "Attempting to sync node with API server" Mar 17 17:58:51.263679 kubelet[2721]: I0317 17:58:51.263642 2721 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:58:51.263679 kubelet[2721]: I0317 17:58:51.263668 2721 kubelet.go:312] "Adding apiserver pod source" Mar 17 17:58:51.263763 kubelet[2721]: I0317 17:58:51.263682 2721 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:58:51.267935 kubelet[2721]: I0317 17:58:51.267919 2721 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:58:51.268144 kubelet[2721]: I0317 17:58:51.268130 2721 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:58:51.269592 kubelet[2721]: I0317 17:58:51.268461 2721 server.go:1264] "Started kubelet" Mar 17 17:58:51.269592 kubelet[2721]: I0317 17:58:51.268712 2721 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:58:51.269592 kubelet[2721]: I0317 17:58:51.268755 2721 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:58:51.269592 kubelet[2721]: I0317 17:58:51.269037 2721 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 
17:58:51.269747 kubelet[2721]: I0317 17:58:51.269738 2721 server.go:455] "Adding debug handlers to kubelet server" Mar 17 17:58:51.270342 kubelet[2721]: I0317 17:58:51.270206 2721 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:58:51.272505 kubelet[2721]: I0317 17:58:51.272474 2721 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:58:51.272622 kubelet[2721]: I0317 17:58:51.272602 2721 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:58:51.273643 kubelet[2721]: I0317 17:58:51.272785 2721 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:58:51.276012 kubelet[2721]: I0317 17:58:51.275966 2721 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:58:51.276121 kubelet[2721]: I0317 17:58:51.276094 2721 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:58:51.277172 kubelet[2721]: I0317 17:58:51.277155 2721 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:58:51.277624 kubelet[2721]: E0317 17:58:51.277602 2721 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:58:51.283284 kubelet[2721]: I0317 17:58:51.283245 2721 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:58:51.285147 kubelet[2721]: I0317 17:58:51.285110 2721 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 17:58:51.285147 kubelet[2721]: I0317 17:58:51.285150 2721 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:58:51.285227 kubelet[2721]: I0317 17:58:51.285162 2721 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:58:51.285227 kubelet[2721]: E0317 17:58:51.285200 2721 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:58:51.312996 kubelet[2721]: I0317 17:58:51.312959 2721 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:58:51.312996 kubelet[2721]: I0317 17:58:51.312974 2721 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:58:51.312996 kubelet[2721]: I0317 17:58:51.312992 2721 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:58:51.313187 kubelet[2721]: I0317 17:58:51.313151 2721 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 17:58:51.313187 kubelet[2721]: I0317 17:58:51.313163 2721 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 17:58:51.313187 kubelet[2721]: I0317 17:58:51.313185 2721 policy_none.go:49] "None policy: Start" Mar 17 17:58:51.313635 kubelet[2721]: I0317 17:58:51.313617 2721 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:58:51.313694 kubelet[2721]: I0317 17:58:51.313640 2721 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:58:51.313792 kubelet[2721]: I0317 17:58:51.313771 2721 state_mem.go:75] "Updated machine memory state" Mar 17 17:58:51.318542 kubelet[2721]: I0317 17:58:51.318517 2721 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:58:51.318962 kubelet[2721]: I0317 17:58:51.318916 2721 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:58:51.319299 kubelet[2721]: I0317 17:58:51.319051 2721 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:58:51.377801 kubelet[2721]: I0317 17:58:51.377771 2721 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:58:51.383767 kubelet[2721]: I0317 17:58:51.383735 2721 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Mar 17 17:58:51.383861 kubelet[2721]: I0317 17:58:51.383827 2721 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 17:58:51.385337 kubelet[2721]: I0317 17:58:51.385295 2721 topology_manager.go:215] "Topology Admit Handler" podUID="567dbeb7f2fb0db97af3c4d7d3e73a7c" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 17:58:51.385479 kubelet[2721]: I0317 17:58:51.385392 2721 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 17:58:51.385479 kubelet[2721]: I0317 17:58:51.385465 2721 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 17:58:51.391532 kubelet[2721]: E0317 17:58:51.391156 2721 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 17 17:58:51.391532 kubelet[2721]: E0317 17:58:51.391440 2721 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:51.391889 kubelet[2721]: E0317 17:58:51.391853 2721 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Mar 17 17:58:51.474251 kubelet[2721]: I0317 17:58:51.474212 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:51.474395 kubelet[2721]: I0317 17:58:51.474257 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 17:58:51.474395 kubelet[2721]: I0317 17:58:51.474279 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:51.474395 kubelet[2721]: I0317 17:58:51.474295 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:51.474395 kubelet[2721]: I0317 17:58:51.474316 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:51.474395 kubelet[2721]: I0317 17:58:51.474334 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/host-path/567dbeb7f2fb0db97af3c4d7d3e73a7c-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"567dbeb7f2fb0db97af3c4d7d3e73a7c\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:58:51.474531 kubelet[2721]: I0317 17:58:51.474353 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/567dbeb7f2fb0db97af3c4d7d3e73a7c-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"567dbeb7f2fb0db97af3c4d7d3e73a7c\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:58:51.474531 kubelet[2721]: I0317 17:58:51.474370 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/567dbeb7f2fb0db97af3c4d7d3e73a7c-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"567dbeb7f2fb0db97af3c4d7d3e73a7c\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:58:51.474531 kubelet[2721]: I0317 17:58:51.474386 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:51.692607 kubelet[2721]: E0317 17:58:51.692381 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:51.692607 kubelet[2721]: E0317 17:58:51.692422 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:51.692607 kubelet[2721]: E0317 17:58:51.692499 2721 dns.go:153] "Nameserver limits exceeded" 
err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:52.265907 kubelet[2721]: I0317 17:58:52.265864 2721 apiserver.go:52] "Watching apiserver" Mar 17 17:58:52.272898 kubelet[2721]: I0317 17:58:52.272835 2721 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:58:52.296111 kubelet[2721]: E0317 17:58:52.296078 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:52.304876 kubelet[2721]: E0317 17:58:52.304830 2721 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 17 17:58:52.305227 kubelet[2721]: E0317 17:58:52.305197 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:52.305362 kubelet[2721]: E0317 17:58:52.305339 2721 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 17 17:58:52.308821 kubelet[2721]: E0317 17:58:52.308773 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:52.473552 kubelet[2721]: I0317 17:58:52.473125 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.473100367 podStartE2EDuration="3.473100367s" podCreationTimestamp="2025-03-17 17:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 
17:58:52.472966261 +0000 UTC m=+1.263809527" watchObservedRunningTime="2025-03-17 17:58:52.473100367 +0000 UTC m=+1.263943633" Mar 17 17:58:52.473552 kubelet[2721]: I0317 17:58:52.473259 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.473255183 podStartE2EDuration="3.473255183s" podCreationTimestamp="2025-03-17 17:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:58:52.339623577 +0000 UTC m=+1.130466843" watchObservedRunningTime="2025-03-17 17:58:52.473255183 +0000 UTC m=+1.264098449" Mar 17 17:58:52.540782 kubelet[2721]: I0317 17:58:52.540646 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.540628965 podStartE2EDuration="3.540628965s" podCreationTimestamp="2025-03-17 17:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:58:52.540464617 +0000 UTC m=+1.331307883" watchObservedRunningTime="2025-03-17 17:58:52.540628965 +0000 UTC m=+1.331472231" Mar 17 17:58:53.297487 kubelet[2721]: E0317 17:58:53.297440 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:53.297929 kubelet[2721]: E0317 17:58:53.297548 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:55.814093 kubelet[2721]: E0317 17:58:55.814051 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:57.683378 
kubelet[2721]: E0317 17:58:57.683340 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:58.305951 kubelet[2721]: E0317 17:58:58.305910 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:58:58.567932 sudo[1714]: pam_unix(sudo:session): session closed for user root Mar 17 17:58:58.569950 sshd[1713]: Connection closed by 10.0.0.1 port 44052 Mar 17 17:58:58.570512 sshd-session[1710]: pam_unix(sshd:session): session closed for user core Mar 17 17:58:58.575147 systemd[1]: sshd@6-10.0.0.118:22-10.0.0.1:44052.service: Deactivated successfully. Mar 17 17:58:58.577751 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 17:58:58.577958 systemd[1]: session-7.scope: Consumed 4.170s CPU time, 243.3M memory peak. Mar 17 17:58:58.579207 systemd-logind[1507]: Session 7 logged out. Waiting for processes to exit. Mar 17 17:58:58.580044 systemd-logind[1507]: Removed session 7. Mar 17 17:58:59.404652 kubelet[2721]: E0317 17:58:59.404603 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:00.307897 kubelet[2721]: E0317 17:59:00.307866 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:00.588634 update_engine[1512]: I20250317 17:59:00.588443 1512 update_attempter.cc:509] Updating boot flags... 
Mar 17 17:59:00.617688 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (2823) Mar 17 17:59:00.657613 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 42 scanned by (udev-worker) (2821) Mar 17 17:59:05.816595 kubelet[2721]: E0317 17:59:05.816538 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:09.152835 kubelet[2721]: I0317 17:59:09.152260 2721 topology_manager.go:215] "Topology Admit Handler" podUID="9ae1e580-be8a-4d4a-914d-8e5a51459913" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-fbscv" Mar 17 17:59:09.162817 kubelet[2721]: W0317 17:59:09.162775 2721 reflector.go:547] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object Mar 17 17:59:09.162817 kubelet[2721]: E0317 17:59:09.162817 2721 reflector.go:150] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object Mar 17 17:59:09.165047 systemd[1]: Created slice kubepods-besteffort-pod9ae1e580_be8a_4d4a_914d_8e5a51459913.slice - libcontainer container kubepods-besteffort-pod9ae1e580_be8a_4d4a_914d_8e5a51459913.slice. 
Mar 17 17:59:09.187786 kubelet[2721]: I0317 17:59:09.187747 2721 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 17:59:09.188280 containerd[1527]: time="2025-03-17T17:59:09.188225421Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 17:59:09.188679 kubelet[2721]: I0317 17:59:09.188608 2721 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 17:59:09.188910 kubelet[2721]: I0317 17:59:09.188825 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9ae1e580-be8a-4d4a-914d-8e5a51459913-var-lib-calico\") pod \"tigera-operator-6479d6dc54-fbscv\" (UID: \"9ae1e580-be8a-4d4a-914d-8e5a51459913\") " pod="tigera-operator/tigera-operator-6479d6dc54-fbscv" Mar 17 17:59:09.188910 kubelet[2721]: I0317 17:59:09.188850 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rms2q\" (UniqueName: \"kubernetes.io/projected/9ae1e580-be8a-4d4a-914d-8e5a51459913-kube-api-access-rms2q\") pod \"tigera-operator-6479d6dc54-fbscv\" (UID: \"9ae1e580-be8a-4d4a-914d-8e5a51459913\") " pod="tigera-operator/tigera-operator-6479d6dc54-fbscv" Mar 17 17:59:09.199384 kubelet[2721]: I0317 17:59:09.199329 2721 topology_manager.go:215] "Topology Admit Handler" podUID="25a8e823-b5c0-4dde-bd7e-a28c0606d3b9" podNamespace="kube-system" podName="kube-proxy-lgjtt" Mar 17 17:59:09.209018 systemd[1]: Created slice kubepods-besteffort-pod25a8e823_b5c0_4dde_bd7e_a28c0606d3b9.slice - libcontainer container kubepods-besteffort-pod25a8e823_b5c0_4dde_bd7e_a28c0606d3b9.slice. 
Mar 17 17:59:09.289810 kubelet[2721]: I0317 17:59:09.289773 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/25a8e823-b5c0-4dde-bd7e-a28c0606d3b9-xtables-lock\") pod \"kube-proxy-lgjtt\" (UID: \"25a8e823-b5c0-4dde-bd7e-a28c0606d3b9\") " pod="kube-system/kube-proxy-lgjtt" Mar 17 17:59:09.289810 kubelet[2721]: I0317 17:59:09.289804 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/25a8e823-b5c0-4dde-bd7e-a28c0606d3b9-lib-modules\") pod \"kube-proxy-lgjtt\" (UID: \"25a8e823-b5c0-4dde-bd7e-a28c0606d3b9\") " pod="kube-system/kube-proxy-lgjtt" Mar 17 17:59:09.289939 kubelet[2721]: I0317 17:59:09.289823 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqqv\" (UniqueName: \"kubernetes.io/projected/25a8e823-b5c0-4dde-bd7e-a28c0606d3b9-kube-api-access-jsqqv\") pod \"kube-proxy-lgjtt\" (UID: \"25a8e823-b5c0-4dde-bd7e-a28c0606d3b9\") " pod="kube-system/kube-proxy-lgjtt" Mar 17 17:59:09.289939 kubelet[2721]: I0317 17:59:09.289847 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/25a8e823-b5c0-4dde-bd7e-a28c0606d3b9-kube-proxy\") pod \"kube-proxy-lgjtt\" (UID: \"25a8e823-b5c0-4dde-bd7e-a28c0606d3b9\") " pod="kube-system/kube-proxy-lgjtt" Mar 17 17:59:09.475208 containerd[1527]: time="2025-03-17T17:59:09.475052479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-fbscv,Uid:9ae1e580-be8a-4d4a-914d-8e5a51459913,Namespace:tigera-operator,Attempt:0,}" Mar 17 17:59:09.503974 containerd[1527]: time="2025-03-17T17:59:09.503858519Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:09.503974 containerd[1527]: time="2025-03-17T17:59:09.503922814Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:09.503974 containerd[1527]: time="2025-03-17T17:59:09.503936503Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:09.504179 containerd[1527]: time="2025-03-17T17:59:09.504020999Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:09.518419 kubelet[2721]: E0317 17:59:09.518378 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:09.519095 containerd[1527]: time="2025-03-17T17:59:09.518886930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lgjtt,Uid:25a8e823-b5c0-4dde-bd7e-a28c0606d3b9,Namespace:kube-system,Attempt:0,}" Mar 17 17:59:09.526789 systemd[1]: Started cri-containerd-c142c0394e103e51340d80c6c1c9a3f166c01ac19e64d5050f196ebbe1951343.scope - libcontainer container c142c0394e103e51340d80c6c1c9a3f166c01ac19e64d5050f196ebbe1951343. Mar 17 17:59:09.546998 containerd[1527]: time="2025-03-17T17:59:09.546809498Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:09.546998 containerd[1527]: time="2025-03-17T17:59:09.546956355Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:09.546998 containerd[1527]: time="2025-03-17T17:59:09.546994314Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:09.547161 containerd[1527]: time="2025-03-17T17:59:09.547091757Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:09.569706 systemd[1]: Started cri-containerd-bb6f35b6c0f5763faf013f04a7413fe65114ed9c06697841feaea0222abb1aba.scope - libcontainer container bb6f35b6c0f5763faf013f04a7413fe65114ed9c06697841feaea0222abb1aba. Mar 17 17:59:09.572357 containerd[1527]: time="2025-03-17T17:59:09.572316898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-fbscv,Uid:9ae1e580-be8a-4d4a-914d-8e5a51459913,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c142c0394e103e51340d80c6c1c9a3f166c01ac19e64d5050f196ebbe1951343\"" Mar 17 17:59:09.574876 containerd[1527]: time="2025-03-17T17:59:09.574798024Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 17 17:59:09.595462 containerd[1527]: time="2025-03-17T17:59:09.595340938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lgjtt,Uid:25a8e823-b5c0-4dde-bd7e-a28c0606d3b9,Namespace:kube-system,Attempt:0,} returns sandbox id \"bb6f35b6c0f5763faf013f04a7413fe65114ed9c06697841feaea0222abb1aba\"" Mar 17 17:59:09.596017 kubelet[2721]: E0317 17:59:09.595991 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:09.600800 containerd[1527]: time="2025-03-17T17:59:09.600770444Z" level=info msg="CreateContainer within sandbox \"bb6f35b6c0f5763faf013f04a7413fe65114ed9c06697841feaea0222abb1aba\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 17:59:09.616785 containerd[1527]: time="2025-03-17T17:59:09.616713151Z" level=info msg="CreateContainer within sandbox \"bb6f35b6c0f5763faf013f04a7413fe65114ed9c06697841feaea0222abb1aba\" for 
&ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"35977ee3380a3bc176c8e748fc9ee98b8b866a5057d96e40de4c011b087824d1\"" Mar 17 17:59:09.617447 containerd[1527]: time="2025-03-17T17:59:09.617394382Z" level=info msg="StartContainer for \"35977ee3380a3bc176c8e748fc9ee98b8b866a5057d96e40de4c011b087824d1\"" Mar 17 17:59:09.654826 systemd[1]: Started cri-containerd-35977ee3380a3bc176c8e748fc9ee98b8b866a5057d96e40de4c011b087824d1.scope - libcontainer container 35977ee3380a3bc176c8e748fc9ee98b8b866a5057d96e40de4c011b087824d1. Mar 17 17:59:09.688151 containerd[1527]: time="2025-03-17T17:59:09.688109729Z" level=info msg="StartContainer for \"35977ee3380a3bc176c8e748fc9ee98b8b866a5057d96e40de4c011b087824d1\" returns successfully" Mar 17 17:59:10.318987 kubelet[2721]: E0317 17:59:10.318960 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:10.326107 kubelet[2721]: I0317 17:59:10.326043 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lgjtt" podStartSLOduration=1.326025248 podStartE2EDuration="1.326025248s" podCreationTimestamp="2025-03-17 17:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:59:10.325718551 +0000 UTC m=+19.116561817" watchObservedRunningTime="2025-03-17 17:59:10.326025248 +0000 UTC m=+19.116868514" Mar 17 17:59:11.230253 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3114082845.mount: Deactivated successfully. 
Mar 17 17:59:12.085644 containerd[1527]: time="2025-03-17T17:59:12.085588381Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:12.086383 containerd[1527]: time="2025-03-17T17:59:12.086329259Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 17 17:59:12.087487 containerd[1527]: time="2025-03-17T17:59:12.087459730Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:12.089969 containerd[1527]: time="2025-03-17T17:59:12.089935333Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:12.090731 containerd[1527]: time="2025-03-17T17:59:12.090702675Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 2.515862594s" Mar 17 17:59:12.090775 containerd[1527]: time="2025-03-17T17:59:12.090730692Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 17 17:59:12.095929 containerd[1527]: time="2025-03-17T17:59:12.095889879Z" level=info msg="CreateContainer within sandbox \"c142c0394e103e51340d80c6c1c9a3f166c01ac19e64d5050f196ebbe1951343\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 17:59:12.108911 containerd[1527]: time="2025-03-17T17:59:12.108858695Z" level=info msg="CreateContainer within sandbox 
\"c142c0394e103e51340d80c6c1c9a3f166c01ac19e64d5050f196ebbe1951343\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c60aff302815b33311ee075253b607f89f78d677324fd501c06da84bee510ca2\"" Mar 17 17:59:12.110209 containerd[1527]: time="2025-03-17T17:59:12.109365771Z" level=info msg="StartContainer for \"c60aff302815b33311ee075253b607f89f78d677324fd501c06da84bee510ca2\"" Mar 17 17:59:12.138751 systemd[1]: Started cri-containerd-c60aff302815b33311ee075253b607f89f78d677324fd501c06da84bee510ca2.scope - libcontainer container c60aff302815b33311ee075253b607f89f78d677324fd501c06da84bee510ca2. Mar 17 17:59:12.166371 containerd[1527]: time="2025-03-17T17:59:12.166331750Z" level=info msg="StartContainer for \"c60aff302815b33311ee075253b607f89f78d677324fd501c06da84bee510ca2\" returns successfully" Mar 17 17:59:15.052167 kubelet[2721]: I0317 17:59:15.050558 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-fbscv" podStartSLOduration=3.530797021 podStartE2EDuration="6.050525337s" podCreationTimestamp="2025-03-17 17:59:09 +0000 UTC" firstStartedPulling="2025-03-17 17:59:09.573865118 +0000 UTC m=+18.364708384" lastFinishedPulling="2025-03-17 17:59:12.093593434 +0000 UTC m=+20.884436700" observedRunningTime="2025-03-17 17:59:12.33589632 +0000 UTC m=+21.126739586" watchObservedRunningTime="2025-03-17 17:59:15.050525337 +0000 UTC m=+23.841368603" Mar 17 17:59:15.054042 kubelet[2721]: I0317 17:59:15.053465 2721 topology_manager.go:215] "Topology Admit Handler" podUID="0d254ee3-70bb-43a2-a894-a8d9916f5164" podNamespace="calico-system" podName="calico-typha-6ddcfb48b4-jh44k" Mar 17 17:59:15.066711 systemd[1]: Created slice kubepods-besteffort-pod0d254ee3_70bb_43a2_a894_a8d9916f5164.slice - libcontainer container kubepods-besteffort-pod0d254ee3_70bb_43a2_a894_a8d9916f5164.slice. 
Mar 17 17:59:15.116404 kubelet[2721]: I0317 17:59:15.116118 2721 topology_manager.go:215] "Topology Admit Handler" podUID="e803e84a-5e61-46fb-91a6-3411250f4200" podNamespace="calico-system" podName="calico-node-r9mrl" Mar 17 17:59:15.128428 systemd[1]: Created slice kubepods-besteffort-pode803e84a_5e61_46fb_91a6_3411250f4200.slice - libcontainer container kubepods-besteffort-pode803e84a_5e61_46fb_91a6_3411250f4200.slice. Mar 17 17:59:15.129709 kubelet[2721]: I0317 17:59:15.129657 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-bin-dir\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129709 kubelet[2721]: I0317 17:59:15.129685 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0d254ee3-70bb-43a2-a894-a8d9916f5164-typha-certs\") pod \"calico-typha-6ddcfb48b4-jh44k\" (UID: \"0d254ee3-70bb-43a2-a894-a8d9916f5164\") " pod="calico-system/calico-typha-6ddcfb48b4-jh44k" Mar 17 17:59:15.129709 kubelet[2721]: I0317 17:59:15.129702 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d254ee3-70bb-43a2-a894-a8d9916f5164-tigera-ca-bundle\") pod \"calico-typha-6ddcfb48b4-jh44k\" (UID: \"0d254ee3-70bb-43a2-a894-a8d9916f5164\") " pod="calico-system/calico-typha-6ddcfb48b4-jh44k" Mar 17 17:59:15.129812 kubelet[2721]: I0317 17:59:15.129719 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tclpd\" (UniqueName: \"kubernetes.io/projected/0d254ee3-70bb-43a2-a894-a8d9916f5164-kube-api-access-tclpd\") pod \"calico-typha-6ddcfb48b4-jh44k\" (UID: \"0d254ee3-70bb-43a2-a894-a8d9916f5164\") " 
pod="calico-system/calico-typha-6ddcfb48b4-jh44k" Mar 17 17:59:15.129812 kubelet[2721]: I0317 17:59:15.129734 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pxfm\" (UniqueName: \"kubernetes.io/projected/e803e84a-5e61-46fb-91a6-3411250f4200-kube-api-access-5pxfm\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129812 kubelet[2721]: I0317 17:59:15.129748 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-policysync\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129812 kubelet[2721]: I0317 17:59:15.129762 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e803e84a-5e61-46fb-91a6-3411250f4200-node-certs\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129812 kubelet[2721]: I0317 17:59:15.129776 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-lib-modules\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129933 kubelet[2721]: I0317 17:59:15.129791 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-xtables-lock\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129933 
kubelet[2721]: I0317 17:59:15.129812 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-lib-calico\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129933 kubelet[2721]: I0317 17:59:15.129827 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-log-dir\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129933 kubelet[2721]: I0317 17:59:15.129842 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e803e84a-5e61-46fb-91a6-3411250f4200-tigera-ca-bundle\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.129933 kubelet[2721]: I0317 17:59:15.129856 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-net-dir\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.130065 kubelet[2721]: I0317 17:59:15.129871 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-run-calico\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.130065 kubelet[2721]: I0317 17:59:15.129884 2721 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-flexvol-driver-host\") pod \"calico-node-r9mrl\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") " pod="calico-system/calico-node-r9mrl" Mar 17 17:59:15.226713 kubelet[2721]: I0317 17:59:15.226270 2721 topology_manager.go:215] "Topology Admit Handler" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" podNamespace="calico-system" podName="csi-node-driver-kdwpf" Mar 17 17:59:15.226713 kubelet[2721]: E0317 17:59:15.226530 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.243719 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246623 kubelet[2721]: W0317 17:59:15.243745 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.243764 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.244653 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246623 kubelet[2721]: W0317 17:59:15.244700 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.244723 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.244957 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246623 kubelet[2721]: W0317 17:59:15.244965 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.244985 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.246623 kubelet[2721]: E0317 17:59:15.245202 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246986 kubelet[2721]: W0317 17:59:15.245210 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.246986 kubelet[2721]: E0317 17:59:15.245230 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.246986 kubelet[2721]: E0317 17:59:15.245417 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246986 kubelet[2721]: W0317 17:59:15.245424 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.246986 kubelet[2721]: E0317 17:59:15.245436 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.246986 kubelet[2721]: E0317 17:59:15.246653 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246986 kubelet[2721]: W0317 17:59:15.246661 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.246986 kubelet[2721]: E0317 17:59:15.246685 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.246986 kubelet[2721]: E0317 17:59:15.246903 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.246986 kubelet[2721]: W0317 17:59:15.246910 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.247203 kubelet[2721]: E0317 17:59:15.246960 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.247203 kubelet[2721]: E0317 17:59:15.247111 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.247203 kubelet[2721]: W0317 17:59:15.247119 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.247203 kubelet[2721]: E0317 17:59:15.247189 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.247323 kubelet[2721]: E0317 17:59:15.247303 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.247323 kubelet[2721]: W0317 17:59:15.247318 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.247410 kubelet[2721]: E0317 17:59:15.247390 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.247591 kubelet[2721]: E0317 17:59:15.247549 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.247591 kubelet[2721]: W0317 17:59:15.247563 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.247808 kubelet[2721]: E0317 17:59:15.247691 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.247850 kubelet[2721]: E0317 17:59:15.247839 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.247850 kubelet[2721]: W0317 17:59:15.247846 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.247939 kubelet[2721]: E0317 17:59:15.247919 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:59:15.248061 kubelet[2721]: E0317 17:59:15.248041 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.248061 kubelet[2721]: W0317 17:59:15.248055 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.248130 kubelet[2721]: E0317 17:59:15.248111 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.248318 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.249823 kubelet[2721]: W0317 17:59:15.248331 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.248421 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.248562 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.249823 kubelet[2721]: W0317 17:59:15.248590 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.248641 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.248858 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.249823 kubelet[2721]: W0317 17:59:15.248865 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.248877 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.249823 kubelet[2721]: E0317 17:59:15.249417 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.250083 kubelet[2721]: W0317 17:59:15.249425 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.250083 kubelet[2721]: E0317 17:59:15.249435 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.250083 kubelet[2721]: E0317 17:59:15.249817 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.250083 kubelet[2721]: W0317 17:59:15.249827 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.250332 kubelet[2721]: E0317 17:59:15.250290 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.253561 kubelet[2721]: E0317 17:59:15.252559 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.253561 kubelet[2721]: W0317 17:59:15.252786 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.253561 kubelet[2721]: E0317 17:59:15.252983 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.255045 kubelet[2721]: E0317 17:59:15.255021 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.255045 kubelet[2721]: W0317 17:59:15.255038 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.255126 kubelet[2721]: E0317 17:59:15.255056 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.255783 kubelet[2721]: E0317 17:59:15.255745 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.255783 kubelet[2721]: W0317 17:59:15.255777 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.255849 kubelet[2721]: E0317 17:59:15.255802 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.257908 kubelet[2721]: E0317 17:59:15.257656 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.257908 kubelet[2721]: W0317 17:59:15.257673 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.257908 kubelet[2721]: E0317 17:59:15.257685 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.267778 kubelet[2721]: E0317 17:59:15.267748 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.267778 kubelet[2721]: W0317 17:59:15.267774 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.267849 kubelet[2721]: E0317 17:59:15.267793 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.319160 kubelet[2721]: E0317 17:59:15.319013 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.319160 kubelet[2721]: W0317 17:59:15.319038 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.319160 kubelet[2721]: E0317 17:59:15.319057 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.319344 kubelet[2721]: E0317 17:59:15.319249 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.319344 kubelet[2721]: W0317 17:59:15.319256 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.319344 kubelet[2721]: E0317 17:59:15.319264 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.319470 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.320463 kubelet[2721]: W0317 17:59:15.319484 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.319550 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.319846 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.320463 kubelet[2721]: W0317 17:59:15.319855 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.319864 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.320112 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.320463 kubelet[2721]: W0317 17:59:15.320120 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.320128 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.320463 kubelet[2721]: E0317 17:59:15.320453 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.320736 kubelet[2721]: W0317 17:59:15.320461 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.320736 kubelet[2721]: E0317 17:59:15.320472 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.322048 kubelet[2721]: E0317 17:59:15.321784 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.322048 kubelet[2721]: W0317 17:59:15.321799 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.322048 kubelet[2721]: E0317 17:59:15.321810 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.322048 kubelet[2721]: E0317 17:59:15.322038 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.322048 kubelet[2721]: W0317 17:59:15.322046 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.322312 kubelet[2721]: E0317 17:59:15.322055 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.322312 kubelet[2721]: E0317 17:59:15.322267 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.322312 kubelet[2721]: W0317 17:59:15.322274 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.322312 kubelet[2721]: E0317 17:59:15.322282 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.322716 kubelet[2721]: E0317 17:59:15.322446 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.322716 kubelet[2721]: W0317 17:59:15.322456 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.322716 kubelet[2721]: E0317 17:59:15.322463 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.322716 kubelet[2721]: E0317 17:59:15.322662 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.322716 kubelet[2721]: W0317 17:59:15.322669 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.322716 kubelet[2721]: E0317 17:59:15.322677 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.322873 kubelet[2721]: E0317 17:59:15.322855 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.322873 kubelet[2721]: W0317 17:59:15.322863 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.322873 kubelet[2721]: E0317 17:59:15.322870 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.323288 kubelet[2721]: E0317 17:59:15.323053 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.323288 kubelet[2721]: W0317 17:59:15.323064 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.323288 kubelet[2721]: E0317 17:59:15.323071 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.323288 kubelet[2721]: E0317 17:59:15.323256 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.323288 kubelet[2721]: W0317 17:59:15.323262 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.323288 kubelet[2721]: E0317 17:59:15.323270 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.323461 kubelet[2721]: E0317 17:59:15.323446 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.323461 kubelet[2721]: W0317 17:59:15.323453 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.323461 kubelet[2721]: E0317 17:59:15.323460 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.323970 kubelet[2721]: E0317 17:59:15.323696 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.323970 kubelet[2721]: W0317 17:59:15.323708 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.323970 kubelet[2721]: E0317 17:59:15.323717 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.324699 kubelet[2721]: E0317 17:59:15.324671 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.324731 kubelet[2721]: W0317 17:59:15.324699 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.324731 kubelet[2721]: E0317 17:59:15.324724 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.325059 kubelet[2721]: E0317 17:59:15.325039 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.325059 kubelet[2721]: W0317 17:59:15.325051 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.325125 kubelet[2721]: E0317 17:59:15.325059 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.325318 kubelet[2721]: E0317 17:59:15.325297 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.325318 kubelet[2721]: W0317 17:59:15.325308 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.325318 kubelet[2721]: E0317 17:59:15.325317 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.325836 kubelet[2721]: E0317 17:59:15.325815 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.325836 kubelet[2721]: W0317 17:59:15.325827 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.325836 kubelet[2721]: E0317 17:59:15.325837 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.331837 kubelet[2721]: E0317 17:59:15.331809 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.331837 kubelet[2721]: W0317 17:59:15.331825 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.331906 kubelet[2721]: E0317 17:59:15.331842 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.331906 kubelet[2721]: I0317 17:59:15.331865 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/64b24927-936e-4c9c-ae85-ea32b09f5a34-registration-dir\") pod \"csi-node-driver-kdwpf\" (UID: \"64b24927-936e-4c9c-ae85-ea32b09f5a34\") " pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:15.332123 kubelet[2721]: E0317 17:59:15.332097 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.332123 kubelet[2721]: W0317 17:59:15.332111 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.332174 kubelet[2721]: E0317 17:59:15.332129 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.332174 kubelet[2721]: I0317 17:59:15.332144 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/64b24927-936e-4c9c-ae85-ea32b09f5a34-varrun\") pod \"csi-node-driver-kdwpf\" (UID: \"64b24927-936e-4c9c-ae85-ea32b09f5a34\") " pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:15.332449 kubelet[2721]: E0317 17:59:15.332433 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.332449 kubelet[2721]: W0317 17:59:15.332445 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.332521 kubelet[2721]: E0317 17:59:15.332464 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.332521 kubelet[2721]: I0317 17:59:15.332477 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/64b24927-936e-4c9c-ae85-ea32b09f5a34-kubelet-dir\") pod \"csi-node-driver-kdwpf\" (UID: \"64b24927-936e-4c9c-ae85-ea32b09f5a34\") " pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:15.332783 kubelet[2721]: E0317 17:59:15.332760 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.332783 kubelet[2721]: W0317 17:59:15.332774 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.332843 kubelet[2721]: E0317 17:59:15.332793 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.332843 kubelet[2721]: I0317 17:59:15.332807 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/64b24927-936e-4c9c-ae85-ea32b09f5a34-socket-dir\") pod \"csi-node-driver-kdwpf\" (UID: \"64b24927-936e-4c9c-ae85-ea32b09f5a34\") " pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:15.333085 kubelet[2721]: E0317 17:59:15.333064 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.333085 kubelet[2721]: W0317 17:59:15.333077 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.333129 kubelet[2721]: E0317 17:59:15.333096 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.333333 kubelet[2721]: E0317 17:59:15.333313 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.333333 kubelet[2721]: W0317 17:59:15.333326 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.333374 kubelet[2721]: E0317 17:59:15.333344 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.333596 kubelet[2721]: E0317 17:59:15.333563 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.333636 kubelet[2721]: W0317 17:59:15.333613 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.333662 kubelet[2721]: E0317 17:59:15.333637 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.333870 kubelet[2721]: E0317 17:59:15.333856 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.333870 kubelet[2721]: W0317 17:59:15.333867 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.334067 kubelet[2721]: E0317 17:59:15.333976 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.334109 kubelet[2721]: E0317 17:59:15.334083 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.334109 kubelet[2721]: W0317 17:59:15.334090 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.334260 kubelet[2721]: E0317 17:59:15.334159 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.334335 kubelet[2721]: E0317 17:59:15.334320 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.334335 kubelet[2721]: W0317 17:59:15.334332 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.334491 kubelet[2721]: E0317 17:59:15.334408 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.334491 kubelet[2721]: I0317 17:59:15.334427 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpjrs\" (UniqueName: \"kubernetes.io/projected/64b24927-936e-4c9c-ae85-ea32b09f5a34-kube-api-access-bpjrs\") pod \"csi-node-driver-kdwpf\" (UID: \"64b24927-936e-4c9c-ae85-ea32b09f5a34\") " pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:15.334550 kubelet[2721]: E0317 17:59:15.334526 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.334550 kubelet[2721]: W0317 17:59:15.334532 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.334635 kubelet[2721]: E0317 17:59:15.334618 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.334813 kubelet[2721]: E0317 17:59:15.334799 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.334813 kubelet[2721]: W0317 17:59:15.334811 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.334862 kubelet[2721]: E0317 17:59:15.334819 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.335080 kubelet[2721]: E0317 17:59:15.335066 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.335080 kubelet[2721]: W0317 17:59:15.335078 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.335121 kubelet[2721]: E0317 17:59:15.335089 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.335274 kubelet[2721]: E0317 17:59:15.335259 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.335274 kubelet[2721]: W0317 17:59:15.335270 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.335333 kubelet[2721]: E0317 17:59:15.335278 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.335492 kubelet[2721]: E0317 17:59:15.335475 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.335492 kubelet[2721]: W0317 17:59:15.335487 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.335492 kubelet[2721]: E0317 17:59:15.335495 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.375270 kubelet[2721]: E0317 17:59:15.375235 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 17:59:15.376126 containerd[1527]: time="2025-03-17T17:59:15.375786274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6ddcfb48b4-jh44k,Uid:0d254ee3-70bb-43a2-a894-a8d9916f5164,Namespace:calico-system,Attempt:0,}"
Mar 17 17:59:15.410972 containerd[1527]: time="2025-03-17T17:59:15.410853270Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:59:15.411702 containerd[1527]: time="2025-03-17T17:59:15.411445008Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:59:15.412870 containerd[1527]: time="2025-03-17T17:59:15.412720883Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:59:15.412870 containerd[1527]: time="2025-03-17T17:59:15.412813301Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:59:15.432824 kubelet[2721]: E0317 17:59:15.432589 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 17:59:15.433509 containerd[1527]: time="2025-03-17T17:59:15.433467785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r9mrl,Uid:e803e84a-5e61-46fb-91a6-3411250f4200,Namespace:calico-system,Attempt:0,}"
Mar 17 17:59:15.434325 systemd[1]: Started cri-containerd-700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0.scope - libcontainer container 700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0.
Mar 17 17:59:15.438256 kubelet[2721]: E0317 17:59:15.438073 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.438256 kubelet[2721]: W0317 17:59:15.438191 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.438256 kubelet[2721]: E0317 17:59:15.438208 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.439592 kubelet[2721]: E0317 17:59:15.438943 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.439592 kubelet[2721]: W0317 17:59:15.438954 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.439592 kubelet[2721]: E0317 17:59:15.438974 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.440355 kubelet[2721]: E0317 17:59:15.440331 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.440508 kubelet[2721]: W0317 17:59:15.440496 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.440615 kubelet[2721]: E0317 17:59:15.440603 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.441449 kubelet[2721]: E0317 17:59:15.441438 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.441657 kubelet[2721]: W0317 17:59:15.441628 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.441736 kubelet[2721]: E0317 17:59:15.441695 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:59:15.443069 kubelet[2721]: E0317 17:59:15.442974 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:59:15.443069 kubelet[2721]: W0317 17:59:15.443005 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:59:15.443069 kubelet[2721]: E0317 17:59:15.443015 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.443402 kubelet[2721]: E0317 17:59:15.443360 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.443402 kubelet[2721]: W0317 17:59:15.443388 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.443473 kubelet[2721]: E0317 17:59:15.443421 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.444665 kubelet[2721]: E0317 17:59:15.444634 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.444665 kubelet[2721]: W0317 17:59:15.444650 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.444918 kubelet[2721]: E0317 17:59:15.444838 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.444977 kubelet[2721]: E0317 17:59:15.444956 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.444977 kubelet[2721]: W0317 17:59:15.444974 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.445097 kubelet[2721]: E0317 17:59:15.445078 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.445288 kubelet[2721]: E0317 17:59:15.445273 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.445288 kubelet[2721]: W0317 17:59:15.445285 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.445602 kubelet[2721]: E0317 17:59:15.445509 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.445674 kubelet[2721]: E0317 17:59:15.445659 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.445674 kubelet[2721]: W0317 17:59:15.445671 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.445808 kubelet[2721]: E0317 17:59:15.445791 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.446068 kubelet[2721]: E0317 17:59:15.446048 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.446148 kubelet[2721]: W0317 17:59:15.446131 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.446246 kubelet[2721]: E0317 17:59:15.446213 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.446750 kubelet[2721]: E0317 17:59:15.446731 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.446750 kubelet[2721]: W0317 17:59:15.446747 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.446951 kubelet[2721]: E0317 17:59:15.446889 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.447067 kubelet[2721]: E0317 17:59:15.447050 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.447067 kubelet[2721]: W0317 17:59:15.447064 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.447148 kubelet[2721]: E0317 17:59:15.447122 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.447496 kubelet[2721]: E0317 17:59:15.447479 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.447496 kubelet[2721]: W0317 17:59:15.447495 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.447624 kubelet[2721]: E0317 17:59:15.447607 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.451257 kubelet[2721]: E0317 17:59:15.450793 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.451257 kubelet[2721]: W0317 17:59:15.450816 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.451257 kubelet[2721]: E0317 17:59:15.450925 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.451257 kubelet[2721]: E0317 17:59:15.451096 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.451257 kubelet[2721]: W0317 17:59:15.451112 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.451257 kubelet[2721]: E0317 17:59:15.451241 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.451670 kubelet[2721]: E0317 17:59:15.451648 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.451670 kubelet[2721]: W0317 17:59:15.451666 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.451810 kubelet[2721]: E0317 17:59:15.451786 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.452073 kubelet[2721]: E0317 17:59:15.452052 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.452073 kubelet[2721]: W0317 17:59:15.452068 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.452201 kubelet[2721]: E0317 17:59:15.452180 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.452401 kubelet[2721]: E0317 17:59:15.452378 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.452454 kubelet[2721]: W0317 17:59:15.452431 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.452530 kubelet[2721]: E0317 17:59:15.452507 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.452881 kubelet[2721]: E0317 17:59:15.452859 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.452881 kubelet[2721]: W0317 17:59:15.452876 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.452971 kubelet[2721]: E0317 17:59:15.452948 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.453214 kubelet[2721]: E0317 17:59:15.453194 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.453214 kubelet[2721]: W0317 17:59:15.453209 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.453355 kubelet[2721]: E0317 17:59:15.453332 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.453470 kubelet[2721]: E0317 17:59:15.453446 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.453502 kubelet[2721]: W0317 17:59:15.453462 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.453528 kubelet[2721]: E0317 17:59:15.453502 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.453903 kubelet[2721]: E0317 17:59:15.453871 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.453903 kubelet[2721]: W0317 17:59:15.453898 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.453965 kubelet[2721]: E0317 17:59:15.453932 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.454241 kubelet[2721]: E0317 17:59:15.454218 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.454241 kubelet[2721]: W0317 17:59:15.454236 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.455153 kubelet[2721]: E0317 17:59:15.454256 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.455153 kubelet[2721]: E0317 17:59:15.454631 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.455153 kubelet[2721]: W0317 17:59:15.454642 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.455153 kubelet[2721]: E0317 17:59:15.454653 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:15.463297 kubelet[2721]: E0317 17:59:15.463270 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:15.463297 kubelet[2721]: W0317 17:59:15.463292 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:15.463382 kubelet[2721]: E0317 17:59:15.463310 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:15.467494 containerd[1527]: time="2025-03-17T17:59:15.466799759Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:15.467494 containerd[1527]: time="2025-03-17T17:59:15.467465969Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:15.467494 containerd[1527]: time="2025-03-17T17:59:15.467479948Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:15.467722 containerd[1527]: time="2025-03-17T17:59:15.467623782Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:15.484716 systemd[1]: Started cri-containerd-09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df.scope - libcontainer container 09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df. 
Mar 17 17:59:15.492138 containerd[1527]: time="2025-03-17T17:59:15.492101249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6ddcfb48b4-jh44k,Uid:0d254ee3-70bb-43a2-a894-a8d9916f5164,Namespace:calico-system,Attempt:0,} returns sandbox id \"700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0\"" Mar 17 17:59:15.492670 kubelet[2721]: E0317 17:59:15.492648 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:15.493900 containerd[1527]: time="2025-03-17T17:59:15.493718281Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 17 17:59:15.510165 containerd[1527]: time="2025-03-17T17:59:15.510080894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-r9mrl,Uid:e803e84a-5e61-46fb-91a6-3411250f4200,Namespace:calico-system,Attempt:0,} returns sandbox id \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\"" Mar 17 17:59:15.510851 kubelet[2721]: E0317 17:59:15.510827 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:17.286320 kubelet[2721]: E0317 17:59:17.285982 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:17.969951 containerd[1527]: time="2025-03-17T17:59:17.969895795Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:17.970835 containerd[1527]: time="2025-03-17T17:59:17.970780522Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 17 17:59:17.972066 containerd[1527]: time="2025-03-17T17:59:17.972029125Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:17.974103 containerd[1527]: time="2025-03-17T17:59:17.974066572Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:17.974738 containerd[1527]: time="2025-03-17T17:59:17.974683985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.480938669s" Mar 17 17:59:17.974738 containerd[1527]: time="2025-03-17T17:59:17.974730690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 17 17:59:17.975862 containerd[1527]: time="2025-03-17T17:59:17.975812706Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 17:59:17.984238 containerd[1527]: time="2025-03-17T17:59:17.984201425Z" level=info msg="CreateContainer within sandbox \"700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 17:59:18.003564 containerd[1527]: time="2025-03-17T17:59:18.003516178Z" level=info msg="CreateContainer within sandbox \"700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id 
\"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\"" Mar 17 17:59:18.004621 containerd[1527]: time="2025-03-17T17:59:18.003864533Z" level=info msg="StartContainer for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\"" Mar 17 17:59:18.030125 systemd[1]: Started cri-containerd-2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282.scope - libcontainer container 2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282. Mar 17 17:59:18.180837 containerd[1527]: time="2025-03-17T17:59:18.180775798Z" level=info msg="StartContainer for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" returns successfully" Mar 17 17:59:18.340323 kubelet[2721]: E0317 17:59:18.340188 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:18.349088 kubelet[2721]: E0317 17:59:18.349045 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.349088 kubelet[2721]: W0317 17:59:18.349072 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.349088 kubelet[2721]: E0317 17:59:18.349095 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:18.349347 kubelet[2721]: E0317 17:59:18.349312 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.349347 kubelet[2721]: W0317 17:59:18.349321 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.349347 kubelet[2721]: E0317 17:59:18.349329 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:18.349541 kubelet[2721]: E0317 17:59:18.349517 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.349541 kubelet[2721]: W0317 17:59:18.349528 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.349541 kubelet[2721]: E0317 17:59:18.349536 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:18.349785 kubelet[2721]: E0317 17:59:18.349762 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.349785 kubelet[2721]: W0317 17:59:18.349773 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.349785 kubelet[2721]: E0317 17:59:18.349782 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:18.350023 kubelet[2721]: E0317 17:59:18.349999 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.350023 kubelet[2721]: W0317 17:59:18.350010 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.350023 kubelet[2721]: E0317 17:59:18.350018 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:18.350216 kubelet[2721]: E0317 17:59:18.350203 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.350216 kubelet[2721]: W0317 17:59:18.350213 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.350300 kubelet[2721]: E0317 17:59:18.350221 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:18.350435 kubelet[2721]: E0317 17:59:18.350413 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.350435 kubelet[2721]: W0317 17:59:18.350423 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.350435 kubelet[2721]: E0317 17:59:18.350432 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:18.350714 kubelet[2721]: E0317 17:59:18.350700 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.350714 kubelet[2721]: W0317 17:59:18.350711 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.350762 kubelet[2721]: E0317 17:59:18.350720 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:18.350932 kubelet[2721]: E0317 17:59:18.350918 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.350932 kubelet[2721]: W0317 17:59:18.350929 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.351006 kubelet[2721]: E0317 17:59:18.350937 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:18.351278 kubelet[2721]: E0317 17:59:18.351255 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.351278 kubelet[2721]: W0317 17:59:18.351269 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.351328 kubelet[2721]: E0317 17:59:18.351278 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:18.351506 kubelet[2721]: E0317 17:59:18.351486 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.351506 kubelet[2721]: W0317 17:59:18.351499 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.351593 kubelet[2721]: E0317 17:59:18.351509 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:18.351724 kubelet[2721]: E0317 17:59:18.351710 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.351724 kubelet[2721]: W0317 17:59:18.351722 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.351788 kubelet[2721]: E0317 17:59:18.351731 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:18.351923 kubelet[2721]: E0317 17:59:18.351910 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:18.351923 kubelet[2721]: W0317 17:59:18.351920 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:18.351994 kubelet[2721]: E0317 17:59:18.351929 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" 
Mar 17 17:59:18.862727 kubelet[2721]: I0317 17:59:18.862655 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6ddcfb48b4-jh44k" podStartSLOduration=1.380651898 podStartE2EDuration="3.862638256s" podCreationTimestamp="2025-03-17 17:59:15 +0000 UTC" firstStartedPulling="2025-03-17 17:59:15.493538834 +0000 UTC m=+24.284382100" lastFinishedPulling="2025-03-17 17:59:17.975525192 +0000 UTC m=+26.766368458" observedRunningTime="2025-03-17 17:59:18.862515779 +0000 UTC m=+27.653359045" watchObservedRunningTime="2025-03-17 17:59:18.862638256 +0000 UTC m=+27.653481522" 
Mar 17 17:59:19.285953 kubelet[2721]: E0317 17:59:19.285505 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" 
Mar 17 17:59:19.341268 kubelet[2721]: I0317 17:59:19.341235 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" 
Mar 17 17:59:19.341839 kubelet[2721]: E0317 17:59:19.341819 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" 
Mar 17 17:59:19.356852 kubelet[2721]: E0317 17:59:19.356818 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input 
Mar 17 17:59:19.356852 kubelet[2721]: W0317 17:59:19.356838 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" 
Mar 17 17:59:19.356968 kubelet[2721]: E0317 17:59:19.356860 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating 
Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:19.357110 kubelet[2721]: E0317 17:59:19.357077 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:19.357110 kubelet[2721]: W0317 17:59:19.357088 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:19.357110 kubelet[2721]: E0317 17:59:19.357107 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:19.357356 kubelet[2721]: E0317 17:59:19.357348 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:19.357387 kubelet[2721]: W0317 17:59:19.357357 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:19.357387 kubelet[2721]: E0317 17:59:19.357366 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:19.374762 kubelet[2721]: E0317 17:59:19.374746 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:19.374762 kubelet[2721]: W0317 17:59:19.374759 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:19.374841 kubelet[2721]: E0317 17:59:19.374769 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:59:19.375212 kubelet[2721]: E0317 17:59:19.375195 2721 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:59:19.375212 kubelet[2721]: W0317 17:59:19.375208 2721 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:59:19.375300 kubelet[2721]: E0317 17:59:19.375218 2721 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:59:20.998410 containerd[1527]: time="2025-03-17T17:59:20.998329022Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:20.999099 containerd[1527]: time="2025-03-17T17:59:20.999032259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 17 17:59:21.000239 containerd[1527]: time="2025-03-17T17:59:21.000200874Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:21.002361 containerd[1527]: time="2025-03-17T17:59:21.002325102Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:21.002814 containerd[1527]: time="2025-03-17T17:59:21.002771320Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 3.026927309s" Mar 17 17:59:21.002845 containerd[1527]: time="2025-03-17T17:59:21.002813584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 17 17:59:21.004854 containerd[1527]: time="2025-03-17T17:59:21.004822785Z" level=info msg="CreateContainer within sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:59:21.024687 containerd[1527]: time="2025-03-17T17:59:21.024636061Z" level=info msg="CreateContainer within sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174\"" Mar 17 17:59:21.025214 containerd[1527]: time="2025-03-17T17:59:21.025184324Z" level=info msg="StartContainer for \"6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174\"" Mar 17 17:59:21.060822 systemd[1]: Started cri-containerd-6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174.scope - libcontainer container 6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174. Mar 17 17:59:21.097702 containerd[1527]: time="2025-03-17T17:59:21.097642453Z" level=info msg="StartContainer for \"6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174\" returns successfully" Mar 17 17:59:21.110723 systemd[1]: cri-containerd-6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174.scope: Deactivated successfully. Mar 17 17:59:21.145844 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174-rootfs.mount: Deactivated successfully. 
Mar 17 17:59:21.285645 kubelet[2721]: E0317 17:59:21.285497 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:21.345757 kubelet[2721]: E0317 17:59:21.345728 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:21.941348 containerd[1527]: time="2025-03-17T17:59:21.941281876Z" level=info msg="shim disconnected" id=6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174 namespace=k8s.io Mar 17 17:59:21.941348 containerd[1527]: time="2025-03-17T17:59:21.941345303Z" level=warning msg="cleaning up after shim disconnected" id=6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174 namespace=k8s.io Mar 17 17:59:21.941348 containerd[1527]: time="2025-03-17T17:59:21.941355614Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:59:22.348910 kubelet[2721]: E0317 17:59:22.348771 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:22.349548 containerd[1527]: time="2025-03-17T17:59:22.349372012Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:59:23.286196 kubelet[2721]: E0317 17:59:23.286129 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:25.176397 systemd[1]: Started 
sshd@7-10.0.0.118:22-10.0.0.1:38342.service - OpenSSH per-connection server daemon (10.0.0.1:38342). Mar 17 17:59:25.222191 sshd[3485]: Accepted publickey for core from 10.0.0.1 port 38342 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM Mar 17 17:59:25.224455 sshd-session[3485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:59:25.229475 systemd-logind[1507]: New session 8 of user core. Mar 17 17:59:25.236971 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 17 17:59:25.287937 kubelet[2721]: E0317 17:59:25.287883 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:25.416055 sshd[3487]: Connection closed by 10.0.0.1 port 38342 Mar 17 17:59:25.416417 sshd-session[3485]: pam_unix(sshd:session): session closed for user core Mar 17 17:59:25.420111 systemd[1]: sshd@7-10.0.0.118:22-10.0.0.1:38342.service: Deactivated successfully. Mar 17 17:59:25.422023 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 17:59:25.422821 systemd-logind[1507]: Session 8 logged out. Waiting for processes to exit. Mar 17 17:59:25.423978 systemd-logind[1507]: Removed session 8. 
Mar 17 17:59:27.286354 kubelet[2721]: E0317 17:59:27.286307 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:27.483341 kubelet[2721]: I0317 17:59:27.483294 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:59:27.498514 kubelet[2721]: E0317 17:59:27.498465 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:28.372868 kubelet[2721]: E0317 17:59:28.372822 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:29.208129 containerd[1527]: time="2025-03-17T17:59:29.208065448Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:29.213791 containerd[1527]: time="2025-03-17T17:59:29.213747329Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 17 17:59:29.217238 containerd[1527]: time="2025-03-17T17:59:29.217196114Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:29.222343 containerd[1527]: time="2025-03-17T17:59:29.222296801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:29.223106 containerd[1527]: 
time="2025-03-17T17:59:29.223055436Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 6.873638684s" Mar 17 17:59:29.223106 containerd[1527]: time="2025-03-17T17:59:29.223098322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 17 17:59:29.225900 containerd[1527]: time="2025-03-17T17:59:29.225860966Z" level=info msg="CreateContainer within sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:59:29.286159 kubelet[2721]: E0317 17:59:29.286112 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:29.358263 containerd[1527]: time="2025-03-17T17:59:29.358188946Z" level=info msg="CreateContainer within sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709\"" Mar 17 17:59:29.359185 containerd[1527]: time="2025-03-17T17:59:29.359125054Z" level=info msg="StartContainer for \"d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709\"" Mar 17 17:59:29.395714 systemd[1]: Started cri-containerd-d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709.scope - libcontainer container 
d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709. Mar 17 17:59:29.888966 containerd[1527]: time="2025-03-17T17:59:29.888915942Z" level=info msg="StartContainer for \"d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709\" returns successfully" Mar 17 17:59:30.380993 kubelet[2721]: E0317 17:59:30.380940 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:30.437405 systemd[1]: Started sshd@8-10.0.0.118:22-10.0.0.1:50668.service - OpenSSH per-connection server daemon (10.0.0.1:50668). Mar 17 17:59:30.487030 sshd[3546]: Accepted publickey for core from 10.0.0.1 port 50668 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM Mar 17 17:59:30.488747 sshd-session[3546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:59:30.492893 systemd-logind[1507]: New session 9 of user core. Mar 17 17:59:30.507771 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 17 17:59:30.971379 sshd[3548]: Connection closed by 10.0.0.1 port 50668 Mar 17 17:59:30.971752 sshd-session[3546]: pam_unix(sshd:session): session closed for user core Mar 17 17:59:30.975057 systemd-logind[1507]: Session 9 logged out. Waiting for processes to exit. Mar 17 17:59:30.975376 systemd[1]: sshd@8-10.0.0.118:22-10.0.0.1:50668.service: Deactivated successfully. Mar 17 17:59:30.977969 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 17:59:30.980494 systemd-logind[1507]: Removed session 9. Mar 17 17:59:31.141251 systemd[1]: cri-containerd-d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709.scope: Deactivated successfully. Mar 17 17:59:31.141690 systemd[1]: cri-containerd-d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709.scope: Consumed 562ms CPU time, 163M memory peak, 4K read from disk, 154M written to disk. 
Mar 17 17:59:31.161974 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709-rootfs.mount: Deactivated successfully. Mar 17 17:59:31.227998 kubelet[2721]: I0317 17:59:31.227665 2721 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 17:59:31.294934 systemd[1]: Created slice kubepods-besteffort-pod64b24927_936e_4c9c_ae85_ea32b09f5a34.slice - libcontainer container kubepods-besteffort-pod64b24927_936e_4c9c_ae85_ea32b09f5a34.slice. Mar 17 17:59:31.295899 kubelet[2721]: I0317 17:59:31.295784 2721 topology_manager.go:215] "Topology Admit Handler" podUID="b92c66d3-2179-49b1-848f-58af8f588b93" podNamespace="kube-system" podName="coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:31.299652 containerd[1527]: time="2025-03-17T17:59:31.299615461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:0,}" Mar 17 17:59:31.302429 systemd[1]: Created slice kubepods-burstable-podb92c66d3_2179_49b1_848f_58af8f588b93.slice - libcontainer container kubepods-burstable-podb92c66d3_2179_49b1_848f_58af8f588b93.slice. Mar 17 17:59:31.330988 kubelet[2721]: I0317 17:59:31.330919 2721 topology_manager.go:215] "Topology Admit Handler" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" podNamespace="calico-system" podName="calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:31.336297 systemd[1]: Created slice kubepods-besteffort-poda88959ff_624d_4e4a_bb08_2634d2121e9c.slice - libcontainer container kubepods-besteffort-poda88959ff_624d_4e4a_bb08_2634d2121e9c.slice. 
Mar 17 17:59:31.359638 kubelet[2721]: I0317 17:59:31.359596 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tj9\" (UniqueName: \"kubernetes.io/projected/b92c66d3-2179-49b1-848f-58af8f588b93-kube-api-access-b7tj9\") pod \"coredns-7db6d8ff4d-jhsn7\" (UID: \"b92c66d3-2179-49b1-848f-58af8f588b93\") " pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:31.359806 kubelet[2721]: I0317 17:59:31.359636 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87l2\" (UniqueName: \"kubernetes.io/projected/a88959ff-624d-4e4a-bb08-2634d2121e9c-kube-api-access-m87l2\") pod \"calico-kube-controllers-7bbc9d96f6-tnt5q\" (UID: \"a88959ff-624d-4e4a-bb08-2634d2121e9c\") " pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:31.362622 kubelet[2721]: I0317 17:59:31.359973 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88959ff-624d-4e4a-bb08-2634d2121e9c-tigera-ca-bundle\") pod \"calico-kube-controllers-7bbc9d96f6-tnt5q\" (UID: \"a88959ff-624d-4e4a-bb08-2634d2121e9c\") " pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:31.362622 kubelet[2721]: I0317 17:59:31.360039 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b92c66d3-2179-49b1-848f-58af8f588b93-config-volume\") pod \"coredns-7db6d8ff4d-jhsn7\" (UID: \"b92c66d3-2179-49b1-848f-58af8f588b93\") " pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:31.381926 kubelet[2721]: E0317 17:59:31.381904 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:31.412483 kubelet[2721]: I0317 17:59:31.412432 2721 
topology_manager.go:215] "Topology Admit Handler" podUID="22ef1751-2cf9-4f31-9bc5-80f3ec9f309d" podNamespace="kube-system" podName="coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:31.412706 kubelet[2721]: I0317 17:59:31.412675 2721 topology_manager.go:215] "Topology Admit Handler" podUID="805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61" podNamespace="calico-apiserver" podName="calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:31.412811 kubelet[2721]: I0317 17:59:31.412791 2721 topology_manager.go:215] "Topology Admit Handler" podUID="0488d793-1538-4b9b-9365-7e94f58bafbf" podNamespace="calico-apiserver" podName="calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:31.416333 containerd[1527]: time="2025-03-17T17:59:31.416262380Z" level=info msg="shim disconnected" id=d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709 namespace=k8s.io Mar 17 17:59:31.416333 containerd[1527]: time="2025-03-17T17:59:31.416328070Z" level=warning msg="cleaning up after shim disconnected" id=d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709 namespace=k8s.io Mar 17 17:59:31.416333 containerd[1527]: time="2025-03-17T17:59:31.416338551Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:59:31.420956 systemd[1]: Created slice kubepods-burstable-pod22ef1751_2cf9_4f31_9bc5_80f3ec9f309d.slice - libcontainer container kubepods-burstable-pod22ef1751_2cf9_4f31_9bc5_80f3ec9f309d.slice. Mar 17 17:59:31.429630 systemd[1]: Created slice kubepods-besteffort-pod0488d793_1538_4b9b_9365_7e94f58bafbf.slice - libcontainer container kubepods-besteffort-pod0488d793_1538_4b9b_9365_7e94f58bafbf.slice. Mar 17 17:59:31.439884 systemd[1]: Created slice kubepods-besteffort-pod805bc4c9_fd3d_4dd2_8a5a_24bf9e34eb61.slice - libcontainer container kubepods-besteffort-pod805bc4c9_fd3d_4dd2_8a5a_24bf9e34eb61.slice. 
Mar 17 17:59:31.460360 kubelet[2721]: I0317 17:59:31.460309 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96rqb\" (UniqueName: \"kubernetes.io/projected/22ef1751-2cf9-4f31-9bc5-80f3ec9f309d-kube-api-access-96rqb\") pod \"coredns-7db6d8ff4d-fn5nz\" (UID: \"22ef1751-2cf9-4f31-9bc5-80f3ec9f309d\") " pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:31.460360 kubelet[2721]: I0317 17:59:31.460348 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ngp2\" (UniqueName: \"kubernetes.io/projected/805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61-kube-api-access-9ngp2\") pod \"calico-apiserver-5df8c6645f-9hw6m\" (UID: \"805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61\") " pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:31.460484 kubelet[2721]: I0317 17:59:31.460465 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0488d793-1538-4b9b-9365-7e94f58bafbf-calico-apiserver-certs\") pod \"calico-apiserver-5df8c6645f-bfh8t\" (UID: \"0488d793-1538-4b9b-9365-7e94f58bafbf\") " pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:31.460533 kubelet[2721]: I0317 17:59:31.460489 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8hwl\" (UniqueName: \"kubernetes.io/projected/0488d793-1538-4b9b-9365-7e94f58bafbf-kube-api-access-m8hwl\") pod \"calico-apiserver-5df8c6645f-bfh8t\" (UID: \"0488d793-1538-4b9b-9365-7e94f58bafbf\") " pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:31.460533 kubelet[2721]: I0317 17:59:31.460514 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22ef1751-2cf9-4f31-9bc5-80f3ec9f309d-config-volume\") pod 
\"coredns-7db6d8ff4d-fn5nz\" (UID: \"22ef1751-2cf9-4f31-9bc5-80f3ec9f309d\") " pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:31.460634 kubelet[2721]: I0317 17:59:31.460535 2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61-calico-apiserver-certs\") pod \"calico-apiserver-5df8c6645f-9hw6m\" (UID: \"805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61\") " pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:31.560117 containerd[1527]: time="2025-03-17T17:59:31.559970655Z" level=error msg="Failed to destroy network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.560473 containerd[1527]: time="2025-03-17T17:59:31.560438602Z" level=error msg="encountered an error cleaning up failed sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.560533 containerd[1527]: time="2025-03-17T17:59:31.560511907Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.561632 kubelet[2721]: E0317 17:59:31.560763 2721 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.562017 kubelet[2721]: E0317 17:59:31.561748 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf" Mar 17 17:59:31.562017 kubelet[2721]: E0317 17:59:31.561780 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf" Mar 17 17:59:31.562017 kubelet[2721]: E0317 17:59:31.561826 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:31.605614 kubelet[2721]: E0317 17:59:31.605529 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:31.606031 containerd[1527]: time="2025-03-17T17:59:31.605994639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:0,}" Mar 17 17:59:31.639728 containerd[1527]: time="2025-03-17T17:59:31.639676038Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:0,}" Mar 17 17:59:31.669794 containerd[1527]: time="2025-03-17T17:59:31.669744421Z" level=error msg="Failed to destroy network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.670165 containerd[1527]: time="2025-03-17T17:59:31.670140586Z" level=error msg="encountered an error cleaning up failed sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.670218 containerd[1527]: time="2025-03-17T17:59:31.670197038Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:0,} failed, error" 
error="failed to setup network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.670508 kubelet[2721]: E0317 17:59:31.670442 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.670602 kubelet[2721]: E0317 17:59:31.670524 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:31.670602 kubelet[2721]: E0317 17:59:31.670551 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:31.670693 kubelet[2721]: E0317 17:59:31.670618 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\" with CreatePodSandboxError: \"Failed 
to create sandbox for pod \\\"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jhsn7" podUID="b92c66d3-2179-49b1-848f-58af8f588b93" Mar 17 17:59:31.707384 containerd[1527]: time="2025-03-17T17:59:31.707325744Z" level=error msg="Failed to destroy network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.707751 containerd[1527]: time="2025-03-17T17:59:31.707715035Z" level=error msg="encountered an error cleaning up failed sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.707790 containerd[1527]: time="2025-03-17T17:59:31.707770595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.708090 kubelet[2721]: E0317 17:59:31.708020 2721 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.708090 kubelet[2721]: E0317 17:59:31.708088 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:31.708251 kubelet[2721]: E0317 17:59:31.708114 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:31.708251 kubelet[2721]: E0317 17:59:31.708166 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" Mar 17 17:59:31.726658 kubelet[2721]: E0317 17:59:31.726625 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:31.727374 containerd[1527]: time="2025-03-17T17:59:31.727064358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:0,}" Mar 17 17:59:31.739014 containerd[1527]: time="2025-03-17T17:59:31.738980508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:59:31.746706 containerd[1527]: time="2025-03-17T17:59:31.746674016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:59:31.806843 containerd[1527]: time="2025-03-17T17:59:31.806767757Z" level=error msg="Failed to destroy network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.807198 containerd[1527]: time="2025-03-17T17:59:31.807173761Z" level=error msg="encountered an error cleaning up failed sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.807270 containerd[1527]: time="2025-03-17T17:59:31.807229331Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.808918 kubelet[2721]: E0317 17:59:31.807483 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.808918 kubelet[2721]: E0317 17:59:31.807565 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:31.808918 kubelet[2721]: E0317 17:59:31.807609 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:31.809081 kubelet[2721]: E0317 17:59:31.807655 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-fn5nz" podUID="22ef1751-2cf9-4f31-9bc5-80f3ec9f309d" Mar 17 17:59:31.815075 containerd[1527]: time="2025-03-17T17:59:31.814949000Z" level=error msg="Failed to destroy network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.815550 containerd[1527]: time="2025-03-17T17:59:31.815361417Z" level=error msg="encountered an error cleaning up failed sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.815550 containerd[1527]: time="2025-03-17T17:59:31.815470483Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network 
for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.815805 kubelet[2721]: E0317 17:59:31.815764 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.815863 kubelet[2721]: E0317 17:59:31.815831 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:31.815863 kubelet[2721]: E0317 17:59:31.815852 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:31.815914 kubelet[2721]: E0317 17:59:31.815892 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" podUID="0488d793-1538-4b9b-9365-7e94f58bafbf" Mar 17 17:59:31.816105 containerd[1527]: time="2025-03-17T17:59:31.816069119Z" level=error msg="Failed to destroy network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.816486 containerd[1527]: time="2025-03-17T17:59:31.816455344Z" level=error msg="encountered an error cleaning up failed sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.816529 containerd[1527]: time="2025-03-17T17:59:31.816510392Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.816712 kubelet[2721]: E0317 
17:59:31.816686 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:31.816752 kubelet[2721]: E0317 17:59:31.816723 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:31.816752 kubelet[2721]: E0317 17:59:31.816744 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:31.816808 kubelet[2721]: E0317 17:59:31.816779 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" podUID="805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61" Mar 17 17:59:32.168708 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb-shm.mount: Deactivated successfully. Mar 17 17:59:32.383812 kubelet[2721]: I0317 17:59:32.383784 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610" Mar 17 17:59:32.384342 containerd[1527]: time="2025-03-17T17:59:32.384285351Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:32.384983 containerd[1527]: time="2025-03-17T17:59:32.384492502Z" level=info msg="Ensure that sandbox 175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610 in task-service has been cleanup successfully" Mar 17 17:59:32.385014 kubelet[2721]: I0317 17:59:32.384562 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb" Mar 17 17:59:32.385277 containerd[1527]: time="2025-03-17T17:59:32.385249809Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\"" Mar 17 17:59:32.385973 containerd[1527]: time="2025-03-17T17:59:32.385950737Z" level=info msg="Ensure that sandbox 2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb in task-service has been cleanup successfully" Mar 17 17:59:32.386146 containerd[1527]: time="2025-03-17T17:59:32.385322554Z" level=info msg="TearDown network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" successfully" Mar 17 17:59:32.386146 containerd[1527]: time="2025-03-17T17:59:32.386081716Z" level=info 
msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" returns successfully" Mar 17 17:59:32.386263 kubelet[2721]: E0317 17:59:32.386252 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:32.386450 containerd[1527]: time="2025-03-17T17:59:32.386353904Z" level=info msg="TearDown network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" successfully" Mar 17 17:59:32.386450 containerd[1527]: time="2025-03-17T17:59:32.386372371Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" returns successfully" Mar 17 17:59:32.386633 containerd[1527]: time="2025-03-17T17:59:32.386500093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:1,}" Mar 17 17:59:32.387892 systemd[1]: run-netns-cni\x2d123c6b70\x2dbc1d\x2da422\x2da685\x2df8cea1f026a5.mount: Deactivated successfully. 
Mar 17 17:59:32.388389 kubelet[2721]: E0317 17:59:32.388262 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:32.388919 containerd[1527]: time="2025-03-17T17:59:32.388889832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:1,}" Mar 17 17:59:32.389283 containerd[1527]: time="2025-03-17T17:59:32.389256517Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 17:59:32.390292 kubelet[2721]: I0317 17:59:32.390263 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5" Mar 17 17:59:32.390780 containerd[1527]: time="2025-03-17T17:59:32.390749500Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" Mar 17 17:59:32.390968 containerd[1527]: time="2025-03-17T17:59:32.390932172Z" level=info msg="Ensure that sandbox ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5 in task-service has been cleanup successfully" Mar 17 17:59:32.391132 systemd[1]: run-netns-cni\x2d2677b72d\x2d4b74\x2d520c\x2d8c82\x2d5b1ff9f2bd04.mount: Deactivated successfully. 
Mar 17 17:59:32.392660 containerd[1527]: time="2025-03-17T17:59:32.391809648Z" level=info msg="TearDown network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" successfully" Mar 17 17:59:32.392660 containerd[1527]: time="2025-03-17T17:59:32.391833736Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" returns successfully" Mar 17 17:59:32.392660 containerd[1527]: time="2025-03-17T17:59:32.392190662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:59:32.392926 kubelet[2721]: I0317 17:59:32.392896 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664" Mar 17 17:59:32.393361 containerd[1527]: time="2025-03-17T17:59:32.393331609Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" Mar 17 17:59:32.393510 containerd[1527]: time="2025-03-17T17:59:32.393494872Z" level=info msg="Ensure that sandbox 47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664 in task-service has been cleanup successfully" Mar 17 17:59:32.393792 containerd[1527]: time="2025-03-17T17:59:32.393772802Z" level=info msg="TearDown network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" successfully" Mar 17 17:59:32.393792 containerd[1527]: time="2025-03-17T17:59:32.393789556Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" returns successfully" Mar 17 17:59:32.393933 kubelet[2721]: E0317 17:59:32.393915 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:32.394109 containerd[1527]: 
time="2025-03-17T17:59:32.394089760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:1,}" Mar 17 17:59:32.394351 kubelet[2721]: I0317 17:59:32.394336 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815" Mar 17 17:59:32.394353 systemd[1]: run-netns-cni\x2d2cdaa6e4\x2d8d2d\x2dc58f\x2d51ac\x2d2ad7fad5a266.mount: Deactivated successfully. Mar 17 17:59:32.395401 containerd[1527]: time="2025-03-17T17:59:32.394792861Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" Mar 17 17:59:32.395904 containerd[1527]: time="2025-03-17T17:59:32.395363680Z" level=info msg="Ensure that sandbox 03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815 in task-service has been cleanup successfully" Mar 17 17:59:32.396167 containerd[1527]: time="2025-03-17T17:59:32.396140236Z" level=info msg="TearDown network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" successfully" Mar 17 17:59:32.396443 kubelet[2721]: I0317 17:59:32.396403 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6" Mar 17 17:59:32.396874 containerd[1527]: time="2025-03-17T17:59:32.396162671Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" returns successfully" Mar 17 17:59:32.396970 systemd[1]: run-netns-cni\x2dc2d4b0a5\x2d217a\x2de6d3\x2d68fa\x2d2731647a76fc.mount: Deactivated successfully. 
Mar 17 17:59:32.398145 containerd[1527]: time="2025-03-17T17:59:32.397374398Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\"" Mar 17 17:59:32.398145 containerd[1527]: time="2025-03-17T17:59:32.397611126Z" level=info msg="Ensure that sandbox e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6 in task-service has been cleanup successfully" Mar 17 17:59:32.398145 containerd[1527]: time="2025-03-17T17:59:32.397977221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:59:32.398720 containerd[1527]: time="2025-03-17T17:59:32.398702185Z" level=info msg="TearDown network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" successfully" Mar 17 17:59:32.398782 containerd[1527]: time="2025-03-17T17:59:32.398769789Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" returns successfully" Mar 17 17:59:32.399983 containerd[1527]: time="2025-03-17T17:59:32.399946487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:1,}" Mar 17 17:59:32.531947 containerd[1527]: time="2025-03-17T17:59:32.531856908Z" level=error msg="Failed to destroy network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.532759 containerd[1527]: time="2025-03-17T17:59:32.532649517Z" level=error msg="encountered an error cleaning up failed sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.532924 containerd[1527]: time="2025-03-17T17:59:32.532896385Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.538012 kubelet[2721]: E0317 17:59:32.537970 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.538118 kubelet[2721]: E0317 17:59:32.538030 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:32.538118 kubelet[2721]: E0317 17:59:32.538054 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:32.538118 kubelet[2721]: E0317 17:59:32.538098 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jhsn7" podUID="b92c66d3-2179-49b1-848f-58af8f588b93" Mar 17 17:59:32.539735 containerd[1527]: time="2025-03-17T17:59:32.539626441Z" level=error msg="Failed to destroy network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.540116 containerd[1527]: time="2025-03-17T17:59:32.540011012Z" level=error msg="encountered an error cleaning up failed sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.540116 containerd[1527]: time="2025-03-17T17:59:32.540051722Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.540600 kubelet[2721]: E0317 17:59:32.540343 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.540600 kubelet[2721]: E0317 17:59:32.540417 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf" Mar 17 17:59:32.540600 kubelet[2721]: E0317 17:59:32.540441 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf" Mar 17 17:59:32.540743 kubelet[2721]: E0317 17:59:32.540543 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:32.543683 containerd[1527]: time="2025-03-17T17:59:32.543628909Z" level=error msg="Failed to destroy network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.544114 containerd[1527]: time="2025-03-17T17:59:32.544091635Z" level=error msg="encountered an error cleaning up failed sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.544201 containerd[1527]: time="2025-03-17T17:59:32.544184688Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.544905 kubelet[2721]: E0317 17:59:32.544445 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.544905 kubelet[2721]: E0317 17:59:32.544492 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:32.544905 kubelet[2721]: E0317 17:59:32.544510 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:32.545023 kubelet[2721]: E0317 17:59:32.544545 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" Mar 17 17:59:32.553472 containerd[1527]: time="2025-03-17T17:59:32.553299482Z" level=error msg="Failed to destroy network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.553925 containerd[1527]: time="2025-03-17T17:59:32.553892546Z" level=error msg="encountered an error cleaning up failed sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.554034 containerd[1527]: time="2025-03-17T17:59:32.553949328Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.554188 kubelet[2721]: E0317 17:59:32.554151 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.554249 kubelet[2721]: E0317 17:59:32.554207 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:32.554249 kubelet[2721]: E0317 17:59:32.554227 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:32.554382 kubelet[2721]: E0317 17:59:32.554264 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" podUID="0488d793-1538-4b9b-9365-7e94f58bafbf" Mar 17 17:59:32.556044 containerd[1527]: time="2025-03-17T17:59:32.556000566Z" level=error msg="Failed to destroy network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.556438 containerd[1527]: time="2025-03-17T17:59:32.556404154Z" level=error msg="encountered an error cleaning up failed sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.556481 containerd[1527]: time="2025-03-17T17:59:32.556466809Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.556648 kubelet[2721]: E0317 17:59:32.556626 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.556818 kubelet[2721]: E0317 17:59:32.556734 2721 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:32.556818 kubelet[2721]: E0317 17:59:32.556756 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:32.556818 kubelet[2721]: E0317 17:59:32.556787 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-fn5nz" podUID="22ef1751-2cf9-4f31-9bc5-80f3ec9f309d" Mar 17 17:59:32.568932 containerd[1527]: time="2025-03-17T17:59:32.568884787Z" level=error msg="Failed to destroy network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.569317 containerd[1527]: time="2025-03-17T17:59:32.569279138Z" level=error msg="encountered an error cleaning up failed sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.569401 containerd[1527]: time="2025-03-17T17:59:32.569341271Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.569534 kubelet[2721]: E0317 17:59:32.569493 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:32.569623 kubelet[2721]: E0317 17:59:32.569551 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:32.569623 kubelet[2721]: E0317 17:59:32.569570 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:32.569695 kubelet[2721]: E0317 17:59:32.569626 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" podUID="805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61" Mar 17 17:59:33.163760 systemd[1]: run-netns-cni\x2d3007b559\x2dcdd1\x2dc536\x2d183a\x2d66fc9daf4487.mount: Deactivated successfully. Mar 17 17:59:33.163869 systemd[1]: run-netns-cni\x2dbbd0f636\x2d62b0\x2d93ad\x2d328e\x2d3b154b423c7a.mount: Deactivated successfully. 
Mar 17 17:59:33.403194 kubelet[2721]: I0317 17:59:33.403160 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc" Mar 17 17:59:33.403822 containerd[1527]: time="2025-03-17T17:59:33.403794033Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\"" Mar 17 17:59:33.404062 containerd[1527]: time="2025-03-17T17:59:33.404008116Z" level=info msg="Ensure that sandbox 574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc in task-service has been cleanup successfully" Mar 17 17:59:33.404242 containerd[1527]: time="2025-03-17T17:59:33.404202070Z" level=info msg="TearDown network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" successfully" Mar 17 17:59:33.404428 containerd[1527]: time="2025-03-17T17:59:33.404219515Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" returns successfully" Mar 17 17:59:33.404751 kubelet[2721]: I0317 17:59:33.404715 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2" Mar 17 17:59:33.405646 containerd[1527]: time="2025-03-17T17:59:33.405366361Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" Mar 17 17:59:33.405646 containerd[1527]: time="2025-03-17T17:59:33.405487219Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\"" Mar 17 17:59:33.405646 containerd[1527]: time="2025-03-17T17:59:33.405522760Z" level=info msg="TearDown network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" successfully" Mar 17 17:59:33.405646 containerd[1527]: time="2025-03-17T17:59:33.405540185Z" level=info msg="StopPodSandbox for 
\"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" returns successfully" Mar 17 17:59:33.405864 containerd[1527]: time="2025-03-17T17:59:33.405656485Z" level=info msg="Ensure that sandbox dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2 in task-service has been cleanup successfully" Mar 17 17:59:33.405864 containerd[1527]: time="2025-03-17T17:59:33.405818945Z" level=info msg="TearDown network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" successfully" Mar 17 17:59:33.406352 containerd[1527]: time="2025-03-17T17:59:33.405830678Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" returns successfully" Mar 17 17:59:33.406417 containerd[1527]: time="2025-03-17T17:59:33.406362760Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\"" Mar 17 17:59:33.406460 containerd[1527]: time="2025-03-17T17:59:33.406444823Z" level=info msg="TearDown network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" successfully" Mar 17 17:59:33.406460 containerd[1527]: time="2025-03-17T17:59:33.406454813Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" returns successfully" Mar 17 17:59:33.406731 containerd[1527]: time="2025-03-17T17:59:33.406700929Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:59:33.406933 systemd[1]: run-netns-cni\x2da212ed09\x2d7bc8\x2d8f22\x2dd487\x2d0e61b6246ee6.mount: Deactivated successfully. 
Mar 17 17:59:33.407292 containerd[1527]: time="2025-03-17T17:59:33.407261226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:2,}" Mar 17 17:59:33.409908 systemd[1]: run-netns-cni\x2db888a52f\x2d0fb0\x2dfedd\x2dd397\x2d986312a67887.mount: Deactivated successfully. Mar 17 17:59:33.410798 kubelet[2721]: I0317 17:59:33.410771 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5" Mar 17 17:59:33.412722 containerd[1527]: time="2025-03-17T17:59:33.411640993Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\"" Mar 17 17:59:33.412722 containerd[1527]: time="2025-03-17T17:59:33.411828283Z" level=info msg="Ensure that sandbox c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5 in task-service has been cleanup successfully" Mar 17 17:59:33.415411 systemd[1]: run-netns-cni\x2dd8dff2a5\x2db3dd\x2dae67\x2d1341\x2d345838467ace.mount: Deactivated successfully. 
Mar 17 17:59:33.416433 containerd[1527]: time="2025-03-17T17:59:33.416370250Z" level=info msg="TearDown network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" successfully" Mar 17 17:59:33.416433 containerd[1527]: time="2025-03-17T17:59:33.416422815Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" returns successfully" Mar 17 17:59:33.417187 containerd[1527]: time="2025-03-17T17:59:33.417159750Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" Mar 17 17:59:33.417367 containerd[1527]: time="2025-03-17T17:59:33.417317382Z" level=info msg="TearDown network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" successfully" Mar 17 17:59:33.417414 containerd[1527]: time="2025-03-17T17:59:33.417364455Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" returns successfully" Mar 17 17:59:33.417518 kubelet[2721]: I0317 17:59:33.417460 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d" Mar 17 17:59:33.418229 kubelet[2721]: E0317 17:59:33.417569 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:33.418270 containerd[1527]: time="2025-03-17T17:59:33.417875695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:2,}" Mar 17 17:59:33.418270 containerd[1527]: time="2025-03-17T17:59:33.418092825Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\"" Mar 17 17:59:33.418496 containerd[1527]: time="2025-03-17T17:59:33.418459440Z" level=info msg="Ensure that sandbox 
dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d in task-service has been cleanup successfully" Mar 17 17:59:33.418898 containerd[1527]: time="2025-03-17T17:59:33.418744193Z" level=info msg="TearDown network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" successfully" Mar 17 17:59:33.418898 containerd[1527]: time="2025-03-17T17:59:33.418777188Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" returns successfully" Mar 17 17:59:33.420541 systemd[1]: run-netns-cni\x2dd538898d\x2ded5e\x2d4c7b\x2d3e8e\x2d768e2720a530.mount: Deactivated successfully. Mar 17 17:59:33.421652 containerd[1527]: time="2025-03-17T17:59:33.421620958Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" Mar 17 17:59:33.421652 containerd[1527]: time="2025-03-17T17:59:33.421726858Z" level=info msg="TearDown network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" successfully" Mar 17 17:59:33.421652 containerd[1527]: time="2025-03-17T17:59:33.421741256Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" returns successfully" Mar 17 17:59:33.421985 kubelet[2721]: I0317 17:59:33.421953 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b" Mar 17 17:59:33.422219 containerd[1527]: time="2025-03-17T17:59:33.422185825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:59:33.422637 containerd[1527]: time="2025-03-17T17:59:33.422409957Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" Mar 17 17:59:33.422859 containerd[1527]: time="2025-03-17T17:59:33.422841851Z" level=info 
msg="Ensure that sandbox 22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b in task-service has been cleanup successfully" Mar 17 17:59:33.423160 containerd[1527]: time="2025-03-17T17:59:33.423143708Z" level=info msg="TearDown network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" successfully" Mar 17 17:59:33.423202 containerd[1527]: time="2025-03-17T17:59:33.423169179Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" returns successfully" Mar 17 17:59:33.423636 kubelet[2721]: I0317 17:59:33.423362 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99" Mar 17 17:59:33.423683 containerd[1527]: time="2025-03-17T17:59:33.423410034Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:33.423683 containerd[1527]: time="2025-03-17T17:59:33.423479512Z" level=info msg="TearDown network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" successfully" Mar 17 17:59:33.423683 containerd[1527]: time="2025-03-17T17:59:33.423488009Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" returns successfully" Mar 17 17:59:33.423761 kubelet[2721]: E0317 17:59:33.423744 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:33.424538 containerd[1527]: time="2025-03-17T17:59:33.424474368Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\"" Mar 17 17:59:33.424538 containerd[1527]: time="2025-03-17T17:59:33.424512052Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:2,}" Mar 17 17:59:33.424661 containerd[1527]: time="2025-03-17T17:59:33.424644234Z" level=info msg="Ensure that sandbox f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99 in task-service has been cleanup successfully" Mar 17 17:59:33.424802 containerd[1527]: time="2025-03-17T17:59:33.424780574Z" level=info msg="TearDown network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" successfully" Mar 17 17:59:33.424802 containerd[1527]: time="2025-03-17T17:59:33.424796374Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" returns successfully" Mar 17 17:59:33.425061 containerd[1527]: time="2025-03-17T17:59:33.425027672Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\"" Mar 17 17:59:33.425116 containerd[1527]: time="2025-03-17T17:59:33.425101928Z" level=info msg="TearDown network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" successfully" Mar 17 17:59:33.425138 containerd[1527]: time="2025-03-17T17:59:33.425114353Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" returns successfully" Mar 17 17:59:33.425752 containerd[1527]: time="2025-03-17T17:59:33.425547480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:2,}" Mar 17 17:59:33.876287 containerd[1527]: time="2025-03-17T17:59:33.876089900Z" level=error msg="Failed to destroy network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 
17:59:33.877234 containerd[1527]: time="2025-03-17T17:59:33.877117772Z" level=error msg="encountered an error cleaning up failed sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.877368 containerd[1527]: time="2025-03-17T17:59:33.877347916Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.877923 kubelet[2721]: E0317 17:59:33.877868 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.878097 kubelet[2721]: E0317 17:59:33.878034 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:33.878097 kubelet[2721]: E0317 17:59:33.878058 2721 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz" Mar 17 17:59:33.879015 kubelet[2721]: E0317 17:59:33.878190 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-fn5nz" podUID="22ef1751-2cf9-4f31-9bc5-80f3ec9f309d" Mar 17 17:59:33.879120 containerd[1527]: time="2025-03-17T17:59:33.878711742Z" level=error msg="Failed to destroy network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.880200 containerd[1527]: time="2025-03-17T17:59:33.880045638Z" level=error msg="encountered an error cleaning up failed sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.880200 containerd[1527]: time="2025-03-17T17:59:33.880104574Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.880319 kubelet[2721]: E0317 17:59:33.880292 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.880357 kubelet[2721]: E0317 17:59:33.880344 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:33.880385 kubelet[2721]: E0317 17:59:33.880366 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" Mar 17 17:59:33.880515 kubelet[2721]: E0317 17:59:33.880408 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" podUID="805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61" Mar 17 17:59:33.885229 containerd[1527]: time="2025-03-17T17:59:33.885176810Z" level=error msg="Failed to destroy network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.885957 containerd[1527]: time="2025-03-17T17:59:33.885793950Z" level=error msg="encountered an error cleaning up failed sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.885957 containerd[1527]: time="2025-03-17T17:59:33.885856002Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.886120 kubelet[2721]: E0317 17:59:33.886055 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.886172 kubelet[2721]: E0317 17:59:33.886139 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf" Mar 17 17:59:33.886207 kubelet[2721]: E0317 17:59:33.886158 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf" Mar 17 17:59:33.886251 kubelet[2721]: E0317 17:59:33.886222 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34" Mar 17 17:59:33.890331 containerd[1527]: time="2025-03-17T17:59:33.890289175Z" level=error msg="Failed to destroy network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.890725 containerd[1527]: time="2025-03-17T17:59:33.890698734Z" level=error msg="encountered an error cleaning up failed sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.890786 containerd[1527]: time="2025-03-17T17:59:33.890760677Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.891004 kubelet[2721]: E0317 17:59:33.890967 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.891242 kubelet[2721]: E0317 17:59:33.891117 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:33.891242 kubelet[2721]: E0317 17:59:33.891148 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" Mar 17 17:59:33.892304 kubelet[2721]: E0317 17:59:33.891209 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" Mar 17 17:59:33.893940 containerd[1527]: time="2025-03-17T17:59:33.893889931Z" level=error msg="Failed to destroy network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.894351 containerd[1527]: time="2025-03-17T17:59:33.894323919Z" level=error msg="encountered an error cleaning up failed sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.894406 containerd[1527]: time="2025-03-17T17:59:33.894376043Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.894668 kubelet[2721]: E0317 17:59:33.894620 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.894976 kubelet[2721]: E0317 17:59:33.894844 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:33.894976 kubelet[2721]: E0317 17:59:33.894875 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7" Mar 17 17:59:33.894976 kubelet[2721]: E0317 17:59:33.894933 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jhsn7" 
podUID="b92c66d3-2179-49b1-848f-58af8f588b93" Mar 17 17:59:33.895727 containerd[1527]: time="2025-03-17T17:59:33.895685571Z" level=error msg="Failed to destroy network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.896175 containerd[1527]: time="2025-03-17T17:59:33.896143575Z" level=error msg="encountered an error cleaning up failed sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.896214 containerd[1527]: time="2025-03-17T17:59:33.896187763Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.896394 kubelet[2721]: E0317 17:59:33.896366 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:59:33.896454 kubelet[2721]: E0317 17:59:33.896414 2721 kuberuntime_sandbox.go:72] "Failed to 
create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:33.896479 kubelet[2721]: E0317 17:59:33.896449 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" Mar 17 17:59:33.896524 kubelet[2721]: E0317 17:59:33.896497 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" podUID="0488d793-1538-4b9b-9365-7e94f58bafbf" Mar 17 17:59:34.163990 systemd[1]: run-netns-cni\x2da2301e29\x2dbd53\x2d17d5\x2d844a\x2d71246afd4eb1.mount: Deactivated successfully. 
Mar 17 17:59:34.164107 systemd[1]: run-netns-cni\x2d4c442acc\x2dbecd\x2d8b43\x2db204\x2d37bdb3a78398.mount: Deactivated successfully. Mar 17 17:59:34.426684 kubelet[2721]: I0317 17:59:34.426658 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08" Mar 17 17:59:34.427256 containerd[1527]: time="2025-03-17T17:59:34.427216607Z" level=info msg="StopPodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\"" Mar 17 17:59:34.427741 containerd[1527]: time="2025-03-17T17:59:34.427413677Z" level=info msg="Ensure that sandbox 607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08 in task-service has been cleanup successfully" Mar 17 17:59:34.429648 systemd[1]: run-netns-cni\x2dfddd9e6f\x2d1b56\x2d7684\x2df2bd\x2d72a062a5bf44.mount: Deactivated successfully. Mar 17 17:59:34.430519 containerd[1527]: time="2025-03-17T17:59:34.430468369Z" level=info msg="TearDown network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" successfully" Mar 17 17:59:34.430519 containerd[1527]: time="2025-03-17T17:59:34.430500372Z" level=info msg="StopPodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" returns successfully" Mar 17 17:59:34.430827 containerd[1527]: time="2025-03-17T17:59:34.430805113Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\"" Mar 17 17:59:34.430906 containerd[1527]: time="2025-03-17T17:59:34.430877927Z" level=info msg="TearDown network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" successfully" Mar 17 17:59:34.430906 containerd[1527]: time="2025-03-17T17:59:34.430886854Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" returns successfully" Mar 17 17:59:34.430953 kubelet[2721]: I0317 17:59:34.430894 2721 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26" Mar 17 17:59:34.431324 containerd[1527]: time="2025-03-17T17:59:34.431067491Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" Mar 17 17:59:34.431324 containerd[1527]: time="2025-03-17T17:59:34.431143031Z" level=info msg="TearDown network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" successfully" Mar 17 17:59:34.431324 containerd[1527]: time="2025-03-17T17:59:34.431152950Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" returns successfully" Mar 17 17:59:34.431324 containerd[1527]: time="2025-03-17T17:59:34.431292817Z" level=info msg="StopPodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\"" Mar 17 17:59:34.431436 containerd[1527]: time="2025-03-17T17:59:34.431422322Z" level=info msg="Ensure that sandbox 62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26 in task-service has been cleanup successfully" Mar 17 17:59:34.431655 containerd[1527]: time="2025-03-17T17:59:34.431618088Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:59:34.431691 containerd[1527]: time="2025-03-17T17:59:34.431646124Z" level=info msg="TearDown network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" successfully" Mar 17 17:59:34.431691 containerd[1527]: time="2025-03-17T17:59:34.431664921Z" level=info msg="StopPodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" returns successfully" Mar 17 17:59:34.432781 containerd[1527]: time="2025-03-17T17:59:34.432756467Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\"" Mar 17 17:59:34.432885 
containerd[1527]: time="2025-03-17T17:59:34.432852356Z" level=info msg="TearDown network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" successfully" Mar 17 17:59:34.432885 containerd[1527]: time="2025-03-17T17:59:34.432865141Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" returns successfully" Mar 17 17:59:34.433198 containerd[1527]: time="2025-03-17T17:59:34.433178580Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\"" Mar 17 17:59:34.433269 containerd[1527]: time="2025-03-17T17:59:34.433252737Z" level=info msg="TearDown network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" successfully" Mar 17 17:59:34.433269 containerd[1527]: time="2025-03-17T17:59:34.433266734Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" returns successfully" Mar 17 17:59:34.433673 containerd[1527]: time="2025-03-17T17:59:34.433651604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:3,}" Mar 17 17:59:34.433735 systemd[1]: run-netns-cni\x2deceb46fe\x2d4af4\x2de764\x2d39cc\x2db6ebe86fe1cf.mount: Deactivated successfully. 
Mar 17 17:59:34.433988 kubelet[2721]: I0317 17:59:34.433963 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d" Mar 17 17:59:34.441096 containerd[1527]: time="2025-03-17T17:59:34.441061967Z" level=info msg="StopPodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\"" Mar 17 17:59:34.441242 containerd[1527]: time="2025-03-17T17:59:34.441219077Z" level=info msg="Ensure that sandbox 6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d in task-service has been cleanup successfully" Mar 17 17:59:34.441440 containerd[1527]: time="2025-03-17T17:59:34.441398061Z" level=info msg="TearDown network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" successfully" Mar 17 17:59:34.441440 containerd[1527]: time="2025-03-17T17:59:34.441413421Z" level=info msg="StopPodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" returns successfully" Mar 17 17:59:34.443418 systemd[1]: run-netns-cni\x2daae1f4b1\x2d512f\x2d8283\x2de60b\x2dc2cb028cedac.mount: Deactivated successfully. 
Mar 17 17:59:34.449277 containerd[1527]: time="2025-03-17T17:59:34.449251298Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" Mar 17 17:59:34.449356 containerd[1527]: time="2025-03-17T17:59:34.449336217Z" level=info msg="TearDown network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" successfully" Mar 17 17:59:34.449356 containerd[1527]: time="2025-03-17T17:59:34.449350815Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" returns successfully" Mar 17 17:59:34.449675 containerd[1527]: time="2025-03-17T17:59:34.449654234Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:34.449757 containerd[1527]: time="2025-03-17T17:59:34.449739763Z" level=info msg="TearDown network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" successfully" Mar 17 17:59:34.449757 containerd[1527]: time="2025-03-17T17:59:34.449752748Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" returns successfully" Mar 17 17:59:34.450084 kubelet[2721]: I0317 17:59:34.450052 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2" Mar 17 17:59:34.450084 kubelet[2721]: E0317 17:59:34.450068 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:34.450350 containerd[1527]: time="2025-03-17T17:59:34.450299108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:3,}" Mar 17 17:59:34.450664 containerd[1527]: time="2025-03-17T17:59:34.450642666Z" level=info msg="StopPodSandbox for 
\"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\"" Mar 17 17:59:34.450873 containerd[1527]: time="2025-03-17T17:59:34.450853272Z" level=info msg="Ensure that sandbox 604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2 in task-service has been cleanup successfully" Mar 17 17:59:34.451897 kubelet[2721]: I0317 17:59:34.451867 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b" Mar 17 17:59:34.452320 containerd[1527]: time="2025-03-17T17:59:34.452281713Z" level=info msg="StopPodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\"" Mar 17 17:59:34.452433 containerd[1527]: time="2025-03-17T17:59:34.452415356Z" level=info msg="Ensure that sandbox aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b in task-service has been cleanup successfully" Mar 17 17:59:34.452811 systemd[1]: run-netns-cni\x2d3a80399d\x2da10f\x2d6596\x2d7a77\x2d464bfbb0c7fc.mount: Deactivated successfully. 
Mar 17 17:59:34.453212 containerd[1527]: time="2025-03-17T17:59:34.453116591Z" level=info msg="TearDown network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" successfully" Mar 17 17:59:34.453212 containerd[1527]: time="2025-03-17T17:59:34.453135839Z" level=info msg="StopPodSandbox for \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" returns successfully" Mar 17 17:59:34.453520 containerd[1527]: time="2025-03-17T17:59:34.453269773Z" level=info msg="TearDown network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" successfully" Mar 17 17:59:34.453520 containerd[1527]: time="2025-03-17T17:59:34.453300494Z" level=info msg="StopPodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" returns successfully" Mar 17 17:59:34.453520 containerd[1527]: time="2025-03-17T17:59:34.453363238Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\"" Mar 17 17:59:34.453520 containerd[1527]: time="2025-03-17T17:59:34.453455591Z" level=info msg="TearDown network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" successfully" Mar 17 17:59:34.453520 containerd[1527]: time="2025-03-17T17:59:34.453473917Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" returns successfully" Mar 17 17:59:34.453881 containerd[1527]: time="2025-03-17T17:59:34.453855510Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\"" Mar 17 17:59:34.453881 containerd[1527]: time="2025-03-17T17:59:34.453876432Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\"" Mar 17 17:59:34.453994 containerd[1527]: time="2025-03-17T17:59:34.453934577Z" level=info msg="TearDown network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" successfully" Mar 
17 17:59:34.453994 containerd[1527]: time="2025-03-17T17:59:34.453945448Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" returns successfully" Mar 17 17:59:34.453994 containerd[1527]: time="2025-03-17T17:59:34.453950207Z" level=info msg="TearDown network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" successfully" Mar 17 17:59:34.453994 containerd[1527]: time="2025-03-17T17:59:34.453965808Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" returns successfully" Mar 17 17:59:34.454246 containerd[1527]: time="2025-03-17T17:59:34.454223968Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" Mar 17 17:59:34.454364 containerd[1527]: time="2025-03-17T17:59:34.454302482Z" level=info msg="TearDown network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" successfully" Mar 17 17:59:34.454364 containerd[1527]: time="2025-03-17T17:59:34.454312973Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" returns successfully" Mar 17 17:59:34.454426 kubelet[2721]: I0317 17:59:34.454352 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07" Mar 17 17:59:34.454684 containerd[1527]: time="2025-03-17T17:59:34.454660540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:59:34.454777 containerd[1527]: time="2025-03-17T17:59:34.454702402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:3,}" Mar 17 17:59:34.454777 containerd[1527]: time="2025-03-17T17:59:34.454732922Z" 
level=info msg="StopPodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\"" Mar 17 17:59:34.454903 containerd[1527]: time="2025-03-17T17:59:34.454880714Z" level=info msg="Ensure that sandbox b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07 in task-service has been cleanup successfully" Mar 17 17:59:34.455061 containerd[1527]: time="2025-03-17T17:59:34.455035030Z" level=info msg="TearDown network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" successfully" Mar 17 17:59:34.455061 containerd[1527]: time="2025-03-17T17:59:34.455051983Z" level=info msg="StopPodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" returns successfully" Mar 17 17:59:34.455395 containerd[1527]: time="2025-03-17T17:59:34.455276006Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\"" Mar 17 17:59:34.455395 containerd[1527]: time="2025-03-17T17:59:34.455346635Z" level=info msg="TearDown network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" successfully" Mar 17 17:59:34.455395 containerd[1527]: time="2025-03-17T17:59:34.455354400Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" returns successfully" Mar 17 17:59:34.455681 containerd[1527]: time="2025-03-17T17:59:34.455660795Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" Mar 17 17:59:34.455755 containerd[1527]: time="2025-03-17T17:59:34.455737657Z" level=info msg="TearDown network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" successfully" Mar 17 17:59:34.455755 containerd[1527]: time="2025-03-17T17:59:34.455750602Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" returns successfully" Mar 17 17:59:34.455895 kubelet[2721]: E0317 17:59:34.455873 
2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:34.456062 containerd[1527]: time="2025-03-17T17:59:34.456043490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:3,}" Mar 17 17:59:35.161771 systemd[1]: run-netns-cni\x2d600fb23d\x2debb2\x2d22ce\x2d083e\x2daa07a648076d.mount: Deactivated successfully. Mar 17 17:59:35.161882 systemd[1]: run-netns-cni\x2d07db1483\x2daef1\x2de1a2\x2d8dc4\x2d9c2c0a8cbe93.mount: Deactivated successfully. Mar 17 17:59:35.984472 systemd[1]: Started sshd@9-10.0.0.118:22-10.0.0.1:57328.service - OpenSSH per-connection server daemon (10.0.0.1:57328). Mar 17 17:59:36.115522 sshd[4273]: Accepted publickey for core from 10.0.0.1 port 57328 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM Mar 17 17:59:36.117281 sshd-session[4273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:59:36.126741 systemd-logind[1507]: New session 10 of user core. Mar 17 17:59:36.137805 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 17 17:59:36.804693 sshd[4275]: Connection closed by 10.0.0.1 port 57328 Mar 17 17:59:36.805088 sshd-session[4273]: pam_unix(sshd:session): session closed for user core Mar 17 17:59:36.809871 systemd[1]: sshd@9-10.0.0.118:22-10.0.0.1:57328.service: Deactivated successfully. Mar 17 17:59:36.812329 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 17:59:36.813284 systemd-logind[1507]: Session 10 logged out. Waiting for processes to exit. Mar 17 17:59:36.814294 systemd-logind[1507]: Removed session 10. Mar 17 17:59:37.599907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3911934867.mount: Deactivated successfully. 
Mar 17 17:59:41.827040 systemd[1]: Started sshd@10-10.0.0.118:22-10.0.0.1:57340.service - OpenSSH per-connection server daemon (10.0.0.1:57340).
Mar 17 17:59:41.859015 containerd[1527]: time="2025-03-17T17:59:41.858933011Z" level=error msg="Failed to destroy network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.863182 containerd[1527]: time="2025-03-17T17:59:41.860597842Z" level=error msg="encountered an error cleaning up failed sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.863182 containerd[1527]: time="2025-03-17T17:59:41.860686626Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.863358 kubelet[2721]: E0317 17:59:41.860951 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.863358 kubelet[2721]: E0317 17:59:41.861021 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7"
Mar 17 17:59:41.863358 kubelet[2721]: E0317 17:59:41.861048 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jhsn7"
Mar 17 17:59:41.864049 kubelet[2721]: E0317 17:59:41.861099 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jhsn7_kube-system(b92c66d3-2179-49b1-848f-58af8f588b93)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jhsn7" podUID="b92c66d3-2179-49b1-848f-58af8f588b93"
Mar 17 17:59:41.876743 containerd[1527]: time="2025-03-17T17:59:41.876696369Z" level=error msg="Failed to destroy network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.878517 containerd[1527]: time="2025-03-17T17:59:41.877219987Z" level=error msg="encountered an error cleaning up failed sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.878517 containerd[1527]: time="2025-03-17T17:59:41.877296598Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.878639 kubelet[2721]: E0317 17:59:41.877636 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.878639 kubelet[2721]: E0317 17:59:41.877695 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q"
Mar 17 17:59:41.878639 kubelet[2721]: E0317 17:59:41.877720 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q"
Mar 17 17:59:41.878741 kubelet[2721]: E0317 17:59:41.877768 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7bbc9d96f6-tnt5q_calico-system(a88959ff-624d-4e4a-bb08-2634d2121e9c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c"
Mar 17 17:59:41.903465 sshd[4351]: Accepted publickey for core from 10.0.0.1 port 57340 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:59:41.904219 sshd-session[4351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:59:41.912164 systemd-logind[1507]: New session 11 of user core.
Mar 17 17:59:41.918377 systemd[1]: Started session-11.scope - Session 11 of User core.
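Every CreatePodSandbox failure in this stretch carries the same root cause (the Calico CNI plugin cannot stat /var/lib/calico/nodename), so it can help to group the "Error syncing pod" records by pod UID to see which pods are stuck behind it. A small hedged parser, written against the log format visible here (the regex and the abridged sample line are illustrative, not a kubelet API):

```python
import re

# Match the trailing pod="..." podUID="..." fields of kubelet's
# "Error syncing pod, skipping" records, as formatted in this journal.
PATTERN = re.compile(r'pod="(?P<pod>[^"]+)" podUID="(?P<uid>[^"]+)"')

def failing_pods(log_text):
    """Map podUID -> pod name for every 'Error syncing pod' record."""
    pods = {}
    for line in log_text.splitlines():
        if "Error syncing pod" not in line:
            continue
        m = PATTERN.search(line)
        if m:
            pods[m.group("uid")] = m.group("pod")
    return pods

# Abridged copy of one record from this log (err value shortened).
sample = 'kubelet[2721]: E0317 17:59:41.861099 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to CreatePodSandbox" pod="kube-system/coredns-7db6d8ff4d-jhsn7" podUID="b92c66d3-2179-49b1-848f-58af8f588b93"'
stuck = failing_pods(sample)
```

Run over the full journal, the same four pod UIDs repeat on every sync attempt until the CNI precondition is met.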
Mar 17 17:59:41.939663 containerd[1527]: time="2025-03-17T17:59:41.939570230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:41.941793 containerd[1527]: time="2025-03-17T17:59:41.941571532Z" level=error msg="Failed to destroy network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.943963 containerd[1527]: time="2025-03-17T17:59:41.943918974Z" level=error msg="encountered an error cleaning up failed sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.946434 containerd[1527]: time="2025-03-17T17:59:41.945724541Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.946541 kubelet[2721]: E0317 17:59:41.946005 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.946541 kubelet[2721]: E0317 17:59:41.946078 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t"
Mar 17 17:59:41.946541 kubelet[2721]: E0317 17:59:41.946104 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t"
Mar 17 17:59:41.947321 kubelet[2721]: E0317 17:59:41.946152 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-bfh8t_calico-apiserver(0488d793-1538-4b9b-9365-7e94f58bafbf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" podUID="0488d793-1538-4b9b-9365-7e94f58bafbf"
Mar 17 17:59:41.949778 containerd[1527]: time="2025-03-17T17:59:41.949546620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445"
Mar 17 17:59:41.960987 containerd[1527]: time="2025-03-17T17:59:41.960333381Z" level=error msg="Failed to destroy network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.962929 containerd[1527]: time="2025-03-17T17:59:41.962899061Z" level=error msg="encountered an error cleaning up failed sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.963082 containerd[1527]: time="2025-03-17T17:59:41.963054617Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.963521 kubelet[2721]: E0317 17:59:41.963319 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.963521 kubelet[2721]: E0317 17:59:41.963376 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:41.963521 kubelet[2721]: E0317 17:59:41.963398 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kdwpf"
Mar 17 17:59:41.963695 kubelet[2721]: E0317 17:59:41.963444 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kdwpf_calico-system(64b24927-936e-4c9c-ae85-ea32b09f5a34)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kdwpf" podUID="64b24927-936e-4c9c-ae85-ea32b09f5a34"
Mar 17 17:59:41.971153 containerd[1527]: time="2025-03-17T17:59:41.971102228Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:41.974051 containerd[1527]: time="2025-03-17T17:59:41.972965909Z" level=error msg="Failed to destroy network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.974051 containerd[1527]: time="2025-03-17T17:59:41.973475059Z" level=error msg="encountered an error cleaning up failed sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.974051 containerd[1527]: time="2025-03-17T17:59:41.973522462Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.974234 kubelet[2721]: E0317 17:59:41.973861 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:41.974234 kubelet[2721]: E0317 17:59:41.973922 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz"
Mar 17 17:59:41.974234 kubelet[2721]: E0317 17:59:41.973942 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-fn5nz"
Mar 17 17:59:41.974368 kubelet[2721]: E0317 17:59:41.974085 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-fn5nz_kube-system(22ef1751-2cf9-4f31-9bc5-80f3ec9f309d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-fn5nz" podUID="22ef1751-2cf9-4f31-9bc5-80f3ec9f309d"
Mar 17 17:59:41.986601 containerd[1527]: time="2025-03-17T17:59:41.986428567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:41.990708 containerd[1527]: time="2025-03-17T17:59:41.990670872Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 9.60138685s"
Mar 17 17:59:41.990770 containerd[1527]: time="2025-03-17T17:59:41.990708265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\""
Mar 17 17:59:42.000075 containerd[1527]: time="2025-03-17T17:59:42.000031211Z" level=info msg="CreateContainer within sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 17 17:59:42.006638 containerd[1527]: time="2025-03-17T17:59:42.006572957Z" level=error msg="Failed to destroy network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:42.007055 containerd[1527]: time="2025-03-17T17:59:42.007017560Z" level=error msg="encountered an error cleaning up failed sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:42.007101 containerd[1527]: time="2025-03-17T17:59:42.007087186Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:42.007311 kubelet[2721]: E0317 17:59:42.007277 2721 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 17 17:59:42.007373 kubelet[2721]: E0317 17:59:42.007331 2721 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m"
Mar 17 17:59:42.007373 kubelet[2721]: E0317 17:59:42.007353 2721 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m"
Mar 17 17:59:42.007518 kubelet[2721]: E0317 17:59:42.007393 2721 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5df8c6645f-9hw6m_calico-apiserver(805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" podUID="805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61"
Mar 17 17:59:42.071839 sshd[4446]: Connection closed by 10.0.0.1 port 57340
Mar 17 17:59:42.072216 sshd-session[4351]: pam_unix(sshd:session): session closed for user core
Mar 17 17:59:42.076448 systemd[1]: sshd@10-10.0.0.118:22-10.0.0.1:57340.service: Deactivated successfully.
Mar 17 17:59:42.078729 systemd[1]: session-11.scope: Deactivated successfully.
Mar 17 17:59:42.079591 systemd-logind[1507]: Session 11 logged out. Waiting for processes to exit.
Mar 17 17:59:42.080444 systemd-logind[1507]: Removed session 11.
Mar 17 17:59:42.192128 containerd[1527]: time="2025-03-17T17:59:42.192065579Z" level=info msg="CreateContainer within sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\""
Mar 17 17:59:42.192643 containerd[1527]: time="2025-03-17T17:59:42.192616461Z" level=info msg="StartContainer for \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\""
Mar 17 17:59:42.273787 systemd[1]: Started cri-containerd-4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc.scope - libcontainer container 4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc.
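The pods above keep reappearing with an incremented Attempt count because kubelet's pod workers re-queue each failed sync with backoff; once calico-node starts (the StartContainer above) and writes the /var/lib/calico/nodename file the plugin stats, a later attempt succeeds. An illustrative retry-with-backoff sketch of that pattern, not kubelet's actual implementation (the attempt counts and delays are made up):

```python
import time

# Illustrative only: retry an operation with capped exponential backoff
# until its precondition is satisfied, as kubelet-style sync loops do.
def run_with_backoff(attempt_fn, max_attempts=5, base_delay=0.01, cap=0.08):
    delay = base_delay
    for attempt in range(1, max_attempts + 1):
        if attempt_fn(attempt):
            return attempt  # the attempt number that succeeded
        time.sleep(delay)
        delay = min(delay * 2, cap)  # double the wait, capped
    raise RuntimeError("sandbox creation kept failing")

# Model the log: the first three attempts fail (nodename file missing),
# the fourth succeeds after calico-node has started.
node_ready_after = 3
result = run_with_backoff(lambda n: n > node_ready_after)
```

With the precondition met after three failures, the function returns on attempt 4, matching the Attempt:3 then Attempt:4 progression in the log.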
Mar 17 17:59:42.345313 containerd[1527]: time="2025-03-17T17:59:42.345216959Z" level=info msg="StartContainer for \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\" returns successfully"
Mar 17 17:59:42.367395 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Mar 17 17:59:42.367523 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Mar 17 17:59:42.479081 kubelet[2721]: I0317 17:59:42.478893 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330"
Mar 17 17:59:42.479484 containerd[1527]: time="2025-03-17T17:59:42.479436517Z" level=info msg="StopPodSandbox for \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\""
Mar 17 17:59:42.479706 containerd[1527]: time="2025-03-17T17:59:42.479680887Z" level=info msg="Ensure that sandbox 358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330 in task-service has been cleanup successfully"
Mar 17 17:59:42.480026 containerd[1527]: time="2025-03-17T17:59:42.479870049Z" level=info msg="TearDown network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\" successfully"
Mar 17 17:59:42.480026 containerd[1527]: time="2025-03-17T17:59:42.479890608Z" level=info msg="StopPodSandbox for \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\" returns successfully"
Mar 17 17:59:42.480263 containerd[1527]: time="2025-03-17T17:59:42.480217901Z" level=info msg="StopPodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\""
Mar 17 17:59:42.480395 containerd[1527]: time="2025-03-17T17:59:42.480333519Z" level=info msg="TearDown network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" successfully"
Mar 17 17:59:42.480395 containerd[1527]: time="2025-03-17T17:59:42.480347917Z" level=info msg="StopPodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" returns successfully"
Mar 17 17:59:42.480806 containerd[1527]: time="2025-03-17T17:59:42.480780295Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\""
Mar 17 17:59:42.480875 containerd[1527]: time="2025-03-17T17:59:42.480858238Z" level=info msg="TearDown network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" successfully"
Mar 17 17:59:42.480875 containerd[1527]: time="2025-03-17T17:59:42.480868018Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" returns successfully"
Mar 17 17:59:42.481104 kubelet[2721]: I0317 17:59:42.481084 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09"
Mar 17 17:59:42.481232 containerd[1527]: time="2025-03-17T17:59:42.481213536Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\""
Mar 17 17:59:42.481309 containerd[1527]: time="2025-03-17T17:59:42.481289746Z" level=info msg="TearDown network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" successfully"
Mar 17 17:59:42.481309 containerd[1527]: time="2025-03-17T17:59:42.481305707Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" returns successfully"
Mar 17 17:59:42.481650 containerd[1527]: time="2025-03-17T17:59:42.481631287Z" level=info msg="StopPodSandbox for \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\""
Mar 17 17:59:42.481710 containerd[1527]: time="2025-03-17T17:59:42.481662288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:4,}"
Mar 17 17:59:42.481830 containerd[1527]: time="2025-03-17T17:59:42.481797092Z" level=info msg="Ensure that sandbox ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09 in task-service has been cleanup successfully"
Mar 17 17:59:42.482016 containerd[1527]: time="2025-03-17T17:59:42.481976293Z" level=info msg="TearDown network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\" successfully"
Mar 17 17:59:42.482016 containerd[1527]: time="2025-03-17T17:59:42.481998557Z" level=info msg="StopPodSandbox for \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\" returns successfully"
Mar 17 17:59:42.482544 containerd[1527]: time="2025-03-17T17:59:42.482375538Z" level=info msg="StopPodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\""
Mar 17 17:59:42.482658 containerd[1527]: time="2025-03-17T17:59:42.482637452Z" level=info msg="TearDown network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" successfully"
Mar 17 17:59:42.482724 containerd[1527]: time="2025-03-17T17:59:42.482707830Z" level=info msg="StopPodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" returns successfully"
Mar 17 17:59:42.483125 containerd[1527]: time="2025-03-17T17:59:42.482942551Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\""
Mar 17 17:59:42.483125 containerd[1527]: time="2025-03-17T17:59:42.483059059Z" level=info msg="TearDown network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" successfully"
Mar 17 17:59:42.483125 containerd[1527]: time="2025-03-17T17:59:42.483074180Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" returns successfully"
Mar 17 17:59:42.483311 kubelet[2721]: I0317 17:59:42.483242 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39"
Mar 17 17:59:42.483634 containerd[1527]: time="2025-03-17T17:59:42.483444307Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\""
Mar 17 17:59:42.483634 containerd[1527]: time="2025-03-17T17:59:42.483532729Z" level=info msg="TearDown network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" successfully"
Mar 17 17:59:42.483634 containerd[1527]: time="2025-03-17T17:59:42.483545364Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" returns successfully"
Mar 17 17:59:42.483727 containerd[1527]: time="2025-03-17T17:59:42.483688956Z" level=info msg="StopPodSandbox for \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\""
Mar 17 17:59:42.483770 kubelet[2721]: E0317 17:59:42.483728 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 17:59:42.483847 containerd[1527]: time="2025-03-17T17:59:42.483825174Z" level=info msg="Ensure that sandbox 4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39 in task-service has been cleanup successfully"
Mar 17 17:59:42.483978 containerd[1527]: time="2025-03-17T17:59:42.483949608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:4,}"
Mar 17 17:59:42.484049 containerd[1527]: time="2025-03-17T17:59:42.484025307Z" level=info msg="TearDown network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\" successfully"
Mar 17 17:59:42.484049 containerd[1527]: time="2025-03-17T17:59:42.484038733Z" level=info msg="StopPodSandbox for \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\" returns successfully"
Mar 17 17:59:42.484436 containerd[1527]: time="2025-03-17T17:59:42.484415083Z" level=info msg="StopPodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\""
Mar 17 17:59:42.484530 containerd[1527]: time="2025-03-17T17:59:42.484493427Z" level=info msg="TearDown network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" successfully"
Mar 17 17:59:42.484530 containerd[1527]: time="2025-03-17T17:59:42.484502534Z" level=info msg="StopPodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" returns successfully"
Mar 17 17:59:42.484724 containerd[1527]: time="2025-03-17T17:59:42.484699982Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\""
Mar 17 17:59:42.484807 containerd[1527]: time="2025-03-17T17:59:42.484788115Z" level=info msg="TearDown network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" successfully"
Mar 17 17:59:42.484807 containerd[1527]: time="2025-03-17T17:59:42.484802303Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" returns successfully"
Mar 17 17:59:42.485122 containerd[1527]: time="2025-03-17T17:59:42.485092192Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\""
Mar 17 17:59:42.485199 containerd[1527]: time="2025-03-17T17:59:42.485183411Z" level=info msg="TearDown network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" successfully"
Mar 17 17:59:42.485231 containerd[1527]: time="2025-03-17T17:59:42.485198951Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" returns successfully"
Mar 17 17:59:42.485429 kubelet[2721]: I0317 17:59:42.485396 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084"
Mar 17 17:59:42.485697 containerd[1527]: time="2025-03-17T17:59:42.485658223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:4,}"
Mar 17 17:59:42.485810 containerd[1527]: time="2025-03-17T17:59:42.485789060Z" level=info msg="StopPodSandbox for \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\""
Mar 17 17:59:42.486003 containerd[1527]: time="2025-03-17T17:59:42.485962491Z" level=info msg="Ensure that sandbox 9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084 in task-service has been cleanup successfully"
Mar 17 17:59:42.486319 containerd[1527]: time="2025-03-17T17:59:42.486297278Z" level=info msg="TearDown network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\" successfully"
Mar 17 17:59:42.486319 containerd[1527]: time="2025-03-17T17:59:42.486317297Z" level=info msg="StopPodSandbox for \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\" returns successfully"
Mar 17 17:59:42.486547 containerd[1527]: time="2025-03-17T17:59:42.486524725Z" level=info msg="StopPodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\""
Mar 17 17:59:42.486620 containerd[1527]: time="2025-03-17T17:59:42.486608649Z" level=info msg="TearDown network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" successfully"
Mar 17 17:59:42.486656 containerd[1527]: time="2025-03-17T17:59:42.486618910Z" level=info msg="StopPodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" returns successfully"
Mar 17 17:59:42.486851 containerd[1527]: time="2025-03-17T17:59:42.486814062Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\""
Mar 17 17:59:42.486900 containerd[1527]: time="2025-03-17T17:59:42.486891234Z" level=info msg="TearDown network for sandbox
\"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" successfully" Mar 17 17:59:42.486943 kubelet[2721]: I0317 17:59:42.486869 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c" Mar 17 17:59:42.487001 containerd[1527]: time="2025-03-17T17:59:42.486900843Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" returns successfully" Mar 17 17:59:42.487223 containerd[1527]: time="2025-03-17T17:59:42.487205231Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\"" Mar 17 17:59:42.487293 containerd[1527]: time="2025-03-17T17:59:42.487275138Z" level=info msg="TearDown network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" successfully" Mar 17 17:59:42.487293 containerd[1527]: time="2025-03-17T17:59:42.487289486Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" returns successfully" Mar 17 17:59:42.487613 containerd[1527]: time="2025-03-17T17:59:42.487521692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:4,}" Mar 17 17:59:42.487976 containerd[1527]: time="2025-03-17T17:59:42.487940835Z" level=info msg="StopPodSandbox for \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\"" Mar 17 17:59:42.488116 containerd[1527]: time="2025-03-17T17:59:42.488097954Z" level=info msg="Ensure that sandbox 66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c in task-service has been cleanup successfully" Mar 17 17:59:42.488340 containerd[1527]: time="2025-03-17T17:59:42.488273248Z" level=info msg="TearDown network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\" successfully" Mar 17 17:59:42.488340 
containerd[1527]: time="2025-03-17T17:59:42.488292686Z" level=info msg="StopPodSandbox for \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\" returns successfully" Mar 17 17:59:42.488678 containerd[1527]: time="2025-03-17T17:59:42.488660809Z" level=info msg="StopPodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\"" Mar 17 17:59:42.488921 kubelet[2721]: I0317 17:59:42.488889 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3" Mar 17 17:59:42.489205 containerd[1527]: time="2025-03-17T17:59:42.489187343Z" level=info msg="TearDown network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" successfully" Mar 17 17:59:42.489205 containerd[1527]: time="2025-03-17T17:59:42.489202452Z" level=info msg="StopPodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" returns successfully" Mar 17 17:59:42.489309 containerd[1527]: time="2025-03-17T17:59:42.489292349Z" level=info msg="StopPodSandbox for \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\"" Mar 17 17:59:42.489455 containerd[1527]: time="2025-03-17T17:59:42.489439237Z" level=info msg="Ensure that sandbox a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3 in task-service has been cleanup successfully" Mar 17 17:59:42.489783 containerd[1527]: time="2025-03-17T17:59:42.489683998Z" level=info msg="TearDown network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\" successfully" Mar 17 17:59:42.489783 containerd[1527]: time="2025-03-17T17:59:42.489703906Z" level=info msg="StopPodSandbox for \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\" returns successfully" Mar 17 17:59:42.490077 containerd[1527]: time="2025-03-17T17:59:42.489939469Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" 
Mar 17 17:59:42.490077 containerd[1527]: time="2025-03-17T17:59:42.489939479Z" level=info msg="StopPodSandbox for \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\"" Mar 17 17:59:42.490077 containerd[1527]: time="2025-03-17T17:59:42.490035167Z" level=info msg="TearDown network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" successfully" Mar 17 17:59:42.490077 containerd[1527]: time="2025-03-17T17:59:42.490044306Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" returns successfully" Mar 17 17:59:42.490186 containerd[1527]: time="2025-03-17T17:59:42.490087039Z" level=info msg="TearDown network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" successfully" Mar 17 17:59:42.490186 containerd[1527]: time="2025-03-17T17:59:42.490099363Z" level=info msg="StopPodSandbox for \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" returns successfully" Mar 17 17:59:42.490455 containerd[1527]: time="2025-03-17T17:59:42.490281060Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\"" Mar 17 17:59:42.490455 containerd[1527]: time="2025-03-17T17:59:42.490341649Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:42.490455 containerd[1527]: time="2025-03-17T17:59:42.490365476Z" level=info msg="TearDown network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" successfully" Mar 17 17:59:42.490455 containerd[1527]: time="2025-03-17T17:59:42.490376287Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" returns successfully" Mar 17 17:59:42.490455 containerd[1527]: time="2025-03-17T17:59:42.490409823Z" level=info msg="TearDown network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" successfully" Mar 
17 17:59:42.490455 containerd[1527]: time="2025-03-17T17:59:42.490418580Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" returns successfully" Mar 17 17:59:42.490615 kubelet[2721]: E0317 17:59:42.490547 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:42.490655 containerd[1527]: time="2025-03-17T17:59:42.490610096Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\"" Mar 17 17:59:42.490708 containerd[1527]: time="2025-03-17T17:59:42.490690284Z" level=info msg="TearDown network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" successfully" Mar 17 17:59:42.490708 containerd[1527]: time="2025-03-17T17:59:42.490705203Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" returns successfully" Mar 17 17:59:42.490910 containerd[1527]: time="2025-03-17T17:59:42.490882812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:4,}" Mar 17 17:59:42.491216 containerd[1527]: time="2025-03-17T17:59:42.491198512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:4,}" Mar 17 17:59:42.492136 kubelet[2721]: E0317 17:59:42.492080 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:42.726164 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330-shm.mount: Deactivated successfully. 
Mar 17 17:59:42.726303 systemd[1]: run-netns-cni\x2d1f878c10\x2dcc3e\x2de83a\x2dd28a\x2d90fb553512e9.mount: Deactivated successfully. Mar 17 17:59:42.726403 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084-shm.mount: Deactivated successfully. Mar 17 17:59:42.726497 systemd[1]: run-netns-cni\x2df3bf40de\x2dead4\x2d4fbe\x2da596\x2dbe6faceb7f05.mount: Deactivated successfully. Mar 17 17:59:42.726612 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c-shm.mount: Deactivated successfully. Mar 17 17:59:43.494385 kubelet[2721]: E0317 17:59:43.494339 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:43.866079 systemd-networkd[1459]: cali0891e75bbcf: Link UP Mar 17 17:59:43.866271 systemd-networkd[1459]: cali0891e75bbcf: Gained carrier Mar 17 17:59:43.877979 kubelet[2721]: I0317 17:59:43.876700 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-r9mrl" podStartSLOduration=2.396729158 podStartE2EDuration="28.876678393s" podCreationTimestamp="2025-03-17 17:59:15 +0000 UTC" firstStartedPulling="2025-03-17 17:59:15.511426109 +0000 UTC m=+24.302269376" lastFinishedPulling="2025-03-17 17:59:41.991375345 +0000 UTC m=+50.782218611" observedRunningTime="2025-03-17 17:59:42.710699058 +0000 UTC m=+51.501542324" watchObservedRunningTime="2025-03-17 17:59:43.876678393 +0000 UTC m=+52.667521659" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.324 [INFO][4684] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.341 [INFO][4684] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0 coredns-7db6d8ff4d- kube-system b92c66d3-2179-49b1-848f-58af8f588b93 798 0 2025-03-17 17:59:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-jhsn7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0891e75bbcf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.341 [INFO][4684] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.419 [INFO][4733] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" HandleID="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Workload="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4733] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" HandleID="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Workload="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ceca0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-jhsn7", "timestamp":"2025-03-17 17:59:43.419517899 
+0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.629 [INFO][4733] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.629 [INFO][4733] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.629 [INFO][4733] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.680 [INFO][4733] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.685 [INFO][4733] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.689 [INFO][4733] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.779 [INFO][4733] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.781 [INFO][4733] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.781 [INFO][4733] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.782 [INFO][4733] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.832 [INFO][4733] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.853 [INFO][4733] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.853 [INFO][4733] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" host="localhost" Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.853 [INFO][4733] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:59:43.884095 containerd[1527]: 2025-03-17 17:59:43.853 [INFO][4733] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" HandleID="k8s-pod-network.7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Workload="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.885311 containerd[1527]: 2025-03-17 17:59:43.857 [INFO][4684] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"b92c66d3-2179-49b1-848f-58af8f588b93", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-jhsn7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0891e75bbcf", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:43.885311 containerd[1527]: 2025-03-17 17:59:43.857 [INFO][4684] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.885311 containerd[1527]: 2025-03-17 17:59:43.857 [INFO][4684] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0891e75bbcf ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.885311 containerd[1527]: 2025-03-17 17:59:43.866 [INFO][4684] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.885311 containerd[1527]: 2025-03-17 17:59:43.867 [INFO][4684] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"b92c66d3-2179-49b1-848f-58af8f588b93", ResourceVersion:"798", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce", Pod:"coredns-7db6d8ff4d-jhsn7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0891e75bbcf", MAC:"12:3f:93:eb:fd:11", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:43.885311 containerd[1527]: 2025-03-17 17:59:43.877 [INFO][4684] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-jhsn7" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jhsn7-eth0" Mar 17 17:59:43.893982 systemd-networkd[1459]: cali4a9dfd194d5: Link UP Mar 17 17:59:43.894539 systemd-networkd[1459]: cali4a9dfd194d5: Gained carrier Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.330 [INFO][4623] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.342 [INFO][4623] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0 calico-kube-controllers-7bbc9d96f6- calico-system a88959ff-624d-4e4a-bb08-2634d2121e9c 801 0 2025-03-17 17:59:15 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7bbc9d96f6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7bbc9d96f6-tnt5q eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4a9dfd194d5 [] []}} ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.342 [INFO][4623] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.419 [INFO][4734] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" HandleID="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Workload="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4734] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" HandleID="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Workload="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e2bd0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7bbc9d96f6-tnt5q", "timestamp":"2025-03-17 17:59:43.419540944 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4734] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.853 [INFO][4734] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.854 [INFO][4734] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.857 [INFO][4734] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.861 [INFO][4734] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.866 [INFO][4734] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.868 [INFO][4734] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.870 [INFO][4734] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.870 [INFO][4734] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.872 [INFO][4734] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699 Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.878 [INFO][4734] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.886 [INFO][4734] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.886 [INFO][4734] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" host="localhost" Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.886 [INFO][4734] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:59:43.910381 containerd[1527]: 2025-03-17 17:59:43.886 [INFO][4734] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" HandleID="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Workload="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.911197 containerd[1527]: 2025-03-17 17:59:43.890 [INFO][4623] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0", GenerateName:"calico-kube-controllers-7bbc9d96f6-", Namespace:"calico-system", SelfLink:"", UID:"a88959ff-624d-4e4a-bb08-2634d2121e9c", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7bbc9d96f6", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7bbc9d96f6-tnt5q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4a9dfd194d5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:43.911197 containerd[1527]: 2025-03-17 17:59:43.890 [INFO][4623] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.911197 containerd[1527]: 2025-03-17 17:59:43.890 [INFO][4623] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a9dfd194d5 ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.911197 containerd[1527]: 2025-03-17 17:59:43.895 [INFO][4623] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.911197 containerd[1527]: 2025-03-17 17:59:43.896 [INFO][4623] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0", GenerateName:"calico-kube-controllers-7bbc9d96f6-", Namespace:"calico-system", SelfLink:"", UID:"a88959ff-624d-4e4a-bb08-2634d2121e9c", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7bbc9d96f6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699", Pod:"calico-kube-controllers-7bbc9d96f6-tnt5q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4a9dfd194d5", MAC:"26:42:8a:3f:69:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:43.911197 containerd[1527]: 2025-03-17 17:59:43.907 [INFO][4623] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Namespace="calico-system" Pod="calico-kube-controllers-7bbc9d96f6-tnt5q" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0" Mar 17 17:59:43.937038 systemd-networkd[1459]: cali91b2dbbb852: Link UP Mar 17 17:59:43.937304 systemd-networkd[1459]: cali91b2dbbb852: Gained carrier Mar 17 17:59:43.947175 containerd[1527]: time="2025-03-17T17:59:43.947008453Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:43.947333 containerd[1527]: time="2025-03-17T17:59:43.947310847Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:43.947443 containerd[1527]: time="2025-03-17T17:59:43.947422515Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:43.950295 containerd[1527]: time="2025-03-17T17:59:43.950166110Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.307 [INFO][4650] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.320 [INFO][4650] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--kdwpf-eth0 csi-node-driver- calico-system 64b24927-936e-4c9c-ae85-ea32b09f5a34 611 0 2025-03-17 17:59:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-kdwpf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali91b2dbbb852 [] []}} ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.322 [INFO][4650] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.418 [INFO][4716] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" HandleID="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Workload="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4716] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" HandleID="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Workload="localhost-k8s-csi--node--driver--kdwpf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000140040), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-kdwpf", "timestamp":"2025-03-17 17:59:43.418080127 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4716] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.887 [INFO][4716] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.887 [INFO][4716] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.889 [INFO][4716] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.900 [INFO][4716] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.913 [INFO][4716] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.917 [INFO][4716] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.919 [INFO][4716] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.919 [INFO][4716] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.920 [INFO][4716] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.924 [INFO][4716] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.928 [INFO][4716] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.928 [INFO][4716] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" host="localhost" Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.928 [INFO][4716] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:59:43.955650 containerd[1527]: 2025-03-17 17:59:43.929 [INFO][4716] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" HandleID="k8s-pod-network.45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Workload="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.956214 containerd[1527]: 2025-03-17 17:59:43.932 [INFO][4650] cni-plugin/k8s.go 386: Populated endpoint ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kdwpf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"64b24927-936e-4c9c-ae85-ea32b09f5a34", ResourceVersion:"611", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-kdwpf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali91b2dbbb852", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:43.956214 containerd[1527]: 2025-03-17 17:59:43.932 [INFO][4650] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.956214 containerd[1527]: 2025-03-17 17:59:43.932 [INFO][4650] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91b2dbbb852 ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.956214 containerd[1527]: 2025-03-17 17:59:43.937 [INFO][4650] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.956214 containerd[1527]: 2025-03-17 17:59:43.938 [INFO][4650] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kdwpf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"64b24927-936e-4c9c-ae85-ea32b09f5a34", ResourceVersion:"611", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f", Pod:"csi-node-driver-kdwpf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali91b2dbbb852", MAC:"3a:7f:88:95:21:d0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:43.956214 containerd[1527]: 2025-03-17 17:59:43.950 [INFO][4650] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f" Namespace="calico-system" Pod="csi-node-driver-kdwpf" WorkloadEndpoint="localhost-k8s-csi--node--driver--kdwpf-eth0" Mar 17 17:59:43.957504 containerd[1527]: time="2025-03-17T17:59:43.956799668Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:43.957504 containerd[1527]: time="2025-03-17T17:59:43.956876609Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:43.957504 containerd[1527]: time="2025-03-17T17:59:43.956892369Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:43.957504 containerd[1527]: time="2025-03-17T17:59:43.957010721Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:43.980805 systemd-networkd[1459]: calif931b687989: Link UP Mar 17 17:59:43.980828 systemd[1]: Started cri-containerd-9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699.scope - libcontainer container 9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699. Mar 17 17:59:43.981629 systemd-networkd[1459]: calif931b687989: Gained carrier Mar 17 17:59:44.003270 containerd[1527]: time="2025-03-17T17:59:44.003023587Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:44.003270 containerd[1527]: time="2025-03-17T17:59:44.003088525Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:44.003270 containerd[1527]: time="2025-03-17T17:59:44.003099446Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.003270 containerd[1527]: time="2025-03-17T17:59:44.003190455Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.003997 systemd[1]: Started cri-containerd-7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce.scope - libcontainer container 7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce. 
Mar 17 17:59:44.008032 systemd-resolved[1373]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:59:44.018559 systemd-resolved[1373]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:59:44.026721 systemd[1]: Started cri-containerd-45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f.scope - libcontainer container 45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f. Mar 17 17:59:44.046270 systemd-resolved[1373]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:59:44.047861 containerd[1527]: time="2025-03-17T17:59:44.047823822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bbc9d96f6-tnt5q,Uid:a88959ff-624d-4e4a-bb08-2634d2121e9c,Namespace:calico-system,Attempt:4,} returns sandbox id \"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699\"" Mar 17 17:59:44.049596 containerd[1527]: time="2025-03-17T17:59:44.049492906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 17 17:59:44.056006 containerd[1527]: time="2025-03-17T17:59:44.055969249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jhsn7,Uid:b92c66d3-2179-49b1-848f-58af8f588b93,Namespace:kube-system,Attempt:4,} returns sandbox id \"7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce\"" Mar 17 17:59:44.058112 kubelet[2721]: E0317 17:59:44.056658 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:44.058452 containerd[1527]: time="2025-03-17T17:59:44.058429866Z" level=info msg="CreateContainer within sandbox \"7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:59:44.061620 
containerd[1527]: time="2025-03-17T17:59:44.061536399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kdwpf,Uid:64b24927-936e-4c9c-ae85-ea32b09f5a34,Namespace:calico-system,Attempt:4,} returns sandbox id \"45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f\"" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.289 [INFO][4619] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.322 [INFO][4619] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0 coredns-7db6d8ff4d- kube-system 22ef1751-2cf9-4f31-9bc5-80f3ec9f309d 805 0 2025-03-17 17:59:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-fn5nz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif931b687989 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.322 [INFO][4619] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.407 [INFO][4715] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" HandleID="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" 
Workload="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4715] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" HandleID="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Workload="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000528ae0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-fn5nz", "timestamp":"2025-03-17 17:59:43.407331825 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4715] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.928 [INFO][4715] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.928 [INFO][4715] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.932 [INFO][4715] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.938 [INFO][4715] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.942 [INFO][4715] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.944 [INFO][4715] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.950 [INFO][4715] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.950 [INFO][4715] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.952 [INFO][4715] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6 Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.957 [INFO][4715] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.966 [INFO][4715] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.967 [INFO][4715] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" host="localhost" Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.967 [INFO][4715] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:59:44.234034 containerd[1527]: 2025-03-17 17:59:43.967 [INFO][4715] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" HandleID="k8s-pod-network.18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Workload="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 17:59:44.234630 containerd[1527]: 2025-03-17 17:59:43.976 [INFO][4619] cni-plugin/k8s.go 386: Populated endpoint ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"22ef1751-2cf9-4f31-9bc5-80f3ec9f309d", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-fn5nz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif931b687989", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:44.234630 containerd[1527]: 2025-03-17 17:59:43.976 [INFO][4619] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 17:59:44.234630 containerd[1527]: 2025-03-17 17:59:43.976 [INFO][4619] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif931b687989 ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 17:59:44.234630 containerd[1527]: 2025-03-17 17:59:43.978 [INFO][4619] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 
17:59:44.234630 containerd[1527]: 2025-03-17 17:59:43.978 [INFO][4619] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"22ef1751-2cf9-4f31-9bc5-80f3ec9f309d", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6", Pod:"coredns-7db6d8ff4d-fn5nz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif931b687989", MAC:"26:ae:2f:85:e3:a3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:44.234630 containerd[1527]: 2025-03-17 17:59:44.232 [INFO][4619] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6" Namespace="kube-system" Pod="coredns-7db6d8ff4d-fn5nz" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--fn5nz-eth0" Mar 17 17:59:44.430294 systemd-networkd[1459]: cali68368f3b3bc: Link UP Mar 17 17:59:44.431386 systemd-networkd[1459]: cali68368f3b3bc: Gained carrier Mar 17 17:59:44.439232 containerd[1527]: time="2025-03-17T17:59:44.439128529Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:44.439232 containerd[1527]: time="2025-03-17T17:59:44.439204137Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:44.439410 containerd[1527]: time="2025-03-17T17:59:44.439224527Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.439410 containerd[1527]: time="2025-03-17T17:59:44.439330565Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.452883 containerd[1527]: time="2025-03-17T17:59:44.452840885Z" level=info msg="CreateContainer within sandbox \"7dcb152706ec1338cc6eebf4d337978d7ae8b8e4becaf1c75c2de7ec5e0229ce\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1e69e5f6d603df1853d7dc72495b2a0ebe20e3540dc38ba6711efb7463a8ac95\"" Mar 17 17:59:44.457166 containerd[1527]: time="2025-03-17T17:59:44.456793398Z" level=info msg="StartContainer for \"1e69e5f6d603df1853d7dc72495b2a0ebe20e3540dc38ba6711efb7463a8ac95\"" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.340 [INFO][4647] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.350 [INFO][4647] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0 calico-apiserver-5df8c6645f- calico-apiserver 0488d793-1538-4b9b-9365-7e94f58bafbf 803 0 2025-03-17 17:59:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5df8c6645f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5df8c6645f-bfh8t eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali68368f3b3bc [] []}} ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.350 [INFO][4647] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" 
Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.407 [INFO][4744] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" HandleID="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Workload="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.628 [INFO][4744] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" HandleID="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Workload="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030e7f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5df8c6645f-bfh8t", "timestamp":"2025-03-17 17:59:43.407451078 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.629 [INFO][4744] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.967 [INFO][4744] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.967 [INFO][4744] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.971 [INFO][4744] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:43.979 [INFO][4744] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.393 [INFO][4744] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.397 [INFO][4744] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.400 [INFO][4744] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.400 [INFO][4744] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.402 [INFO][4744] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.410 [INFO][4744] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.417 [INFO][4744] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.417 [INFO][4744] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" host="localhost" Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.418 [INFO][4744] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:59:44.458962 containerd[1527]: 2025-03-17 17:59:44.418 [INFO][4744] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" HandleID="k8s-pod-network.37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Workload="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.459703 containerd[1527]: 2025-03-17 17:59:44.425 [INFO][4647] cni-plugin/k8s.go 386: Populated endpoint ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0", GenerateName:"calico-apiserver-5df8c6645f-", Namespace:"calico-apiserver", SelfLink:"", UID:"0488d793-1538-4b9b-9365-7e94f58bafbf", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5df8c6645f", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5df8c6645f-bfh8t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68368f3b3bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:44.459703 containerd[1527]: 2025-03-17 17:59:44.425 [INFO][4647] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.459703 containerd[1527]: 2025-03-17 17:59:44.425 [INFO][4647] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68368f3b3bc ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.459703 containerd[1527]: 2025-03-17 17:59:44.430 [INFO][4647] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.459703 containerd[1527]: 2025-03-17 17:59:44.432 [INFO][4647] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0", GenerateName:"calico-apiserver-5df8c6645f-", Namespace:"calico-apiserver", SelfLink:"", UID:"0488d793-1538-4b9b-9365-7e94f58bafbf", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5df8c6645f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df", Pod:"calico-apiserver-5df8c6645f-bfh8t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68368f3b3bc", MAC:"c2:2b:29:95:77:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:44.459703 containerd[1527]: 2025-03-17 17:59:44.445 [INFO][4647] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-bfh8t" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--bfh8t-eth0" Mar 17 17:59:44.491739 systemd[1]: Started cri-containerd-18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6.scope - libcontainer container 18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6. Mar 17 17:59:44.515746 systemd-networkd[1459]: cali19347291283: Link UP Mar 17 17:59:44.516935 systemd-networkd[1459]: cali19347291283: Gained carrier Mar 17 17:59:44.518759 systemd[1]: Started cri-containerd-1e69e5f6d603df1853d7dc72495b2a0ebe20e3540dc38ba6711efb7463a8ac95.scope - libcontainer container 1e69e5f6d603df1853d7dc72495b2a0ebe20e3540dc38ba6711efb7463a8ac95. Mar 17 17:59:44.531856 containerd[1527]: time="2025-03-17T17:59:44.531682932Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:44.531856 containerd[1527]: time="2025-03-17T17:59:44.531827396Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:44.531981 containerd[1527]: time="2025-03-17T17:59:44.531858066Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.532338 containerd[1527]: time="2025-03-17T17:59:44.532008431Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.532828 systemd-resolved[1373]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:59:44.547597 kernel: bpftool[5181]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 17:59:44.555161 systemd[1]: Started cri-containerd-37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df.scope - libcontainer container 37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df. Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:43.294 [INFO][4655] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:43.321 [INFO][4655] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0 calico-apiserver-5df8c6645f- calico-apiserver 805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61 804 0 2025-03-17 17:59:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5df8c6645f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5df8c6645f-9hw6m eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali19347291283 [] []}} ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:43.321 [INFO][4655] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:43.407 [INFO][4712] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" HandleID="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Workload="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:43.629 [INFO][4712] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" HandleID="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Workload="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000280720), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5df8c6645f-9hw6m", "timestamp":"2025-03-17 17:59:43.407303559 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:43.629 [INFO][4712] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.420 [INFO][4712] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.420 [INFO][4712] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.424 [INFO][4712] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.434 [INFO][4712] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.447 [INFO][4712] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.460 [INFO][4712] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.462 [INFO][4712] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.462 [INFO][4712] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.465 [INFO][4712] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942 Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.475 [INFO][4712] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.486 [INFO][4712] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.486 [INFO][4712] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" host="localhost" Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.487 [INFO][4712] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:59:44.560533 containerd[1527]: 2025-03-17 17:59:44.487 [INFO][4712] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" HandleID="k8s-pod-network.c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Workload="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.561314 containerd[1527]: 2025-03-17 17:59:44.500 [INFO][4655] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0", GenerateName:"calico-apiserver-5df8c6645f-", Namespace:"calico-apiserver", SelfLink:"", UID:"805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5df8c6645f", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5df8c6645f-9hw6m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali19347291283", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:44.561314 containerd[1527]: 2025-03-17 17:59:44.506 [INFO][4655] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.561314 containerd[1527]: 2025-03-17 17:59:44.507 [INFO][4655] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali19347291283 ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.561314 containerd[1527]: 2025-03-17 17:59:44.517 [INFO][4655] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.561314 containerd[1527]: 2025-03-17 17:59:44.517 [INFO][4655] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0", GenerateName:"calico-apiserver-5df8c6645f-", Namespace:"calico-apiserver", SelfLink:"", UID:"805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 59, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5df8c6645f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942", Pod:"calico-apiserver-5df8c6645f-9hw6m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali19347291283", MAC:"d6:c5:48:e6:38:35", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:59:44.561314 containerd[1527]: 2025-03-17 17:59:44.549 [INFO][4655] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942" Namespace="calico-apiserver" Pod="calico-apiserver-5df8c6645f-9hw6m" WorkloadEndpoint="localhost-k8s-calico--apiserver--5df8c6645f--9hw6m-eth0" Mar 17 17:59:44.586790 containerd[1527]: time="2025-03-17T17:59:44.586542018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-fn5nz,Uid:22ef1751-2cf9-4f31-9bc5-80f3ec9f309d,Namespace:kube-system,Attempt:4,} returns sandbox id \"18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6\"" Mar 17 17:59:44.587655 kubelet[2721]: E0317 17:59:44.587270 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:44.591017 containerd[1527]: time="2025-03-17T17:59:44.590971936Z" level=info msg="CreateContainer within sandbox \"18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:59:44.603327 systemd-resolved[1373]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:59:44.609334 containerd[1527]: time="2025-03-17T17:59:44.604083926Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:59:44.609334 containerd[1527]: time="2025-03-17T17:59:44.604152900Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:59:44.609334 containerd[1527]: time="2025-03-17T17:59:44.604165306Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.609334 containerd[1527]: time="2025-03-17T17:59:44.604253318Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:59:44.609334 containerd[1527]: time="2025-03-17T17:59:44.609231362Z" level=info msg="StartContainer for \"1e69e5f6d603df1853d7dc72495b2a0ebe20e3540dc38ba6711efb7463a8ac95\" returns successfully" Mar 17 17:59:44.628828 systemd[1]: Started cri-containerd-c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942.scope - libcontainer container c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942. Mar 17 17:59:44.644935 containerd[1527]: time="2025-03-17T17:59:44.644478278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-bfh8t,Uid:0488d793-1538-4b9b-9365-7e94f58bafbf,Namespace:calico-apiserver,Attempt:4,} returns sandbox id \"37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df\"" Mar 17 17:59:44.659924 systemd-resolved[1373]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:59:44.668297 containerd[1527]: time="2025-03-17T17:59:44.668246800Z" level=info msg="CreateContainer within sandbox \"18047bf127dacd9565a85026aa71826e24c2024f39aebab3d6c9737d6e47afa6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7ead0b1d57ded737cba3a6cb46e7b4380b9a5daa8671358157ba3e1a3fb902eb\"" Mar 17 17:59:44.669057 containerd[1527]: time="2025-03-17T17:59:44.669019546Z" level=info msg="StartContainer for \"7ead0b1d57ded737cba3a6cb46e7b4380b9a5daa8671358157ba3e1a3fb902eb\"" Mar 17 17:59:44.707804 systemd[1]: Started cri-containerd-7ead0b1d57ded737cba3a6cb46e7b4380b9a5daa8671358157ba3e1a3fb902eb.scope - libcontainer container 7ead0b1d57ded737cba3a6cb46e7b4380b9a5daa8671358157ba3e1a3fb902eb. 
Mar 17 17:59:44.750723 containerd[1527]: time="2025-03-17T17:59:44.749968335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5df8c6645f-9hw6m,Uid:805bc4c9-fd3d-4dd2-8a5a-24bf9e34eb61,Namespace:calico-apiserver,Attempt:4,} returns sandbox id \"c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942\"" Mar 17 17:59:44.914063 systemd-networkd[1459]: vxlan.calico: Link UP Mar 17 17:59:44.914074 systemd-networkd[1459]: vxlan.calico: Gained carrier Mar 17 17:59:44.929063 containerd[1527]: time="2025-03-17T17:59:44.929023758Z" level=info msg="StartContainer for \"7ead0b1d57ded737cba3a6cb46e7b4380b9a5daa8671358157ba3e1a3fb902eb\" returns successfully" Mar 17 17:59:45.082798 systemd-networkd[1459]: cali0891e75bbcf: Gained IPv6LL Mar 17 17:59:45.339016 systemd-networkd[1459]: calif931b687989: Gained IPv6LL Mar 17 17:59:45.402720 systemd-networkd[1459]: cali91b2dbbb852: Gained IPv6LL Mar 17 17:59:45.466797 systemd-networkd[1459]: cali4a9dfd194d5: Gained IPv6LL Mar 17 17:59:45.535498 kubelet[2721]: E0317 17:59:45.535456 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:45.543695 kubelet[2721]: E0317 17:59:45.543324 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:45.546121 kubelet[2721]: I0317 17:59:45.546078 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-fn5nz" podStartSLOduration=36.546066171 podStartE2EDuration="36.546066171s" podCreationTimestamp="2025-03-17 17:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:59:45.545727146 +0000 UTC m=+54.336570412" watchObservedRunningTime="2025-03-17 
17:59:45.546066171 +0000 UTC m=+54.336909437" Mar 17 17:59:45.569684 kubelet[2721]: I0317 17:59:45.569622 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-jhsn7" podStartSLOduration=36.569605541 podStartE2EDuration="36.569605541s" podCreationTimestamp="2025-03-17 17:59:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:59:45.559558614 +0000 UTC m=+54.350401900" watchObservedRunningTime="2025-03-17 17:59:45.569605541 +0000 UTC m=+54.360448807" Mar 17 17:59:45.595360 systemd-networkd[1459]: cali19347291283: Gained IPv6LL Mar 17 17:59:45.786742 systemd-networkd[1459]: cali68368f3b3bc: Gained IPv6LL Mar 17 17:59:46.490756 systemd-networkd[1459]: vxlan.calico: Gained IPv6LL Mar 17 17:59:46.545358 kubelet[2721]: E0317 17:59:46.545323 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:46.545782 kubelet[2721]: E0317 17:59:46.545457 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:47.061886 containerd[1527]: time="2025-03-17T17:59:47.061821966Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:47.062572 containerd[1527]: time="2025-03-17T17:59:47.062513451Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 17 17:59:47.063658 containerd[1527]: time="2025-03-17T17:59:47.063628455Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Mar 17 17:59:47.065794 containerd[1527]: time="2025-03-17T17:59:47.065759369Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:47.066395 containerd[1527]: time="2025-03-17T17:59:47.066355778Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.01683635s" Mar 17 17:59:47.066442 containerd[1527]: time="2025-03-17T17:59:47.066393892Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 17 17:59:47.067373 containerd[1527]: time="2025-03-17T17:59:47.067185303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 17:59:47.073435 containerd[1527]: time="2025-03-17T17:59:47.073402763Z" level=info msg="CreateContainer within sandbox \"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 17:59:47.090008 containerd[1527]: time="2025-03-17T17:59:47.089973273Z" level=info msg="CreateContainer within sandbox \"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\"" Mar 17 17:59:47.090453 containerd[1527]: time="2025-03-17T17:59:47.090428664Z" level=info msg="StartContainer for \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\"" 
Mar 17 17:59:47.095880 systemd[1]: Started sshd@11-10.0.0.118:22-10.0.0.1:58798.service - OpenSSH per-connection server daemon (10.0.0.1:58798). Mar 17 17:59:47.130724 systemd[1]: Started cri-containerd-45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86.scope - libcontainer container 45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86. Mar 17 17:59:47.147964 sshd[5402]: Accepted publickey for core from 10.0.0.1 port 58798 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM Mar 17 17:59:47.149932 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:59:47.154256 systemd-logind[1507]: New session 12 of user core. Mar 17 17:59:47.162831 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 17 17:59:47.178535 containerd[1527]: time="2025-03-17T17:59:47.178456755Z" level=info msg="StartContainer for \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\" returns successfully" Mar 17 17:59:47.297408 sshd[5429]: Connection closed by 10.0.0.1 port 58798 Mar 17 17:59:47.297876 sshd-session[5402]: pam_unix(sshd:session): session closed for user core Mar 17 17:59:47.310321 systemd[1]: sshd@11-10.0.0.118:22-10.0.0.1:58798.service: Deactivated successfully. Mar 17 17:59:47.312156 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 17:59:47.313615 systemd-logind[1507]: Session 12 logged out. Waiting for processes to exit. Mar 17 17:59:47.318920 systemd[1]: Started sshd@12-10.0.0.118:22-10.0.0.1:58808.service - OpenSSH per-connection server daemon (10.0.0.1:58808). Mar 17 17:59:47.319825 systemd-logind[1507]: Removed session 12. 
Mar 17 17:59:47.356042 sshd[5453]: Accepted publickey for core from 10.0.0.1 port 58808 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM Mar 17 17:59:47.357448 sshd-session[5453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:59:47.362967 systemd-logind[1507]: New session 13 of user core. Mar 17 17:59:47.372722 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 17 17:59:47.550036 kubelet[2721]: E0317 17:59:47.550007 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:47.550036 kubelet[2721]: E0317 17:59:47.550056 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:59:47.624347 kubelet[2721]: I0317 17:59:47.618881 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7bbc9d96f6-tnt5q" podStartSLOduration=29.600970747 podStartE2EDuration="32.618865922s" podCreationTimestamp="2025-03-17 17:59:15 +0000 UTC" firstStartedPulling="2025-03-17 17:59:44.049110135 +0000 UTC m=+52.839953401" lastFinishedPulling="2025-03-17 17:59:47.06700531 +0000 UTC m=+55.857848576" observedRunningTime="2025-03-17 17:59:47.618389338 +0000 UTC m=+56.409232604" watchObservedRunningTime="2025-03-17 17:59:47.618865922 +0000 UTC m=+56.409709188" Mar 17 17:59:47.638337 sshd[5456]: Connection closed by 10.0.0.1 port 58808 Mar 17 17:59:47.638788 sshd-session[5453]: pam_unix(sshd:session): session closed for user core Mar 17 17:59:47.652898 systemd[1]: sshd@12-10.0.0.118:22-10.0.0.1:58808.service: Deactivated successfully. Mar 17 17:59:47.655642 systemd[1]: session-13.scope: Deactivated successfully. Mar 17 17:59:47.657560 systemd-logind[1507]: Session 13 logged out. Waiting for processes to exit. 
Mar 17 17:59:47.672932 systemd[1]: Started sshd@13-10.0.0.118:22-10.0.0.1:58814.service - OpenSSH per-connection server daemon (10.0.0.1:58814). Mar 17 17:59:47.675967 systemd-logind[1507]: Removed session 13. Mar 17 17:59:47.875730 sshd[5466]: Accepted publickey for core from 10.0.0.1 port 58814 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM Mar 17 17:59:47.876274 sshd-session[5466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:59:47.887622 systemd-logind[1507]: New session 14 of user core. Mar 17 17:59:47.893597 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 17 17:59:48.028220 sshd[5469]: Connection closed by 10.0.0.1 port 58814 Mar 17 17:59:48.026998 sshd-session[5466]: pam_unix(sshd:session): session closed for user core Mar 17 17:59:48.031833 systemd[1]: sshd@13-10.0.0.118:22-10.0.0.1:58814.service: Deactivated successfully. Mar 17 17:59:48.035017 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 17:59:48.036233 systemd-logind[1507]: Session 14 logged out. Waiting for processes to exit. Mar 17 17:59:48.037294 systemd-logind[1507]: Removed session 14. 
Mar 17 17:59:49.517065 containerd[1527]: time="2025-03-17T17:59:49.516994681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:49.573668 containerd[1527]: time="2025-03-17T17:59:49.573606999Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 17 17:59:49.627592 containerd[1527]: time="2025-03-17T17:59:49.627519982Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:49.694863 containerd[1527]: time="2025-03-17T17:59:49.694765263Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:59:49.695528 containerd[1527]: time="2025-03-17T17:59:49.695496785Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.628283566s" Mar 17 17:59:49.695528 containerd[1527]: time="2025-03-17T17:59:49.695522956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 17 17:59:49.696644 containerd[1527]: time="2025-03-17T17:59:49.696599163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:59:49.697831 containerd[1527]: time="2025-03-17T17:59:49.697804021Z" level=info msg="CreateContainer within sandbox \"45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f\" for container 
&ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 17:59:49.934968 containerd[1527]: time="2025-03-17T17:59:49.934903817Z" level=info msg="CreateContainer within sandbox \"45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a68c87573a20d4306a3417eee669a7db9ebc0b17a0c43f950ee0d82adfeb57f4\"" Mar 17 17:59:49.935466 containerd[1527]: time="2025-03-17T17:59:49.935437281Z" level=info msg="StartContainer for \"a68c87573a20d4306a3417eee669a7db9ebc0b17a0c43f950ee0d82adfeb57f4\"" Mar 17 17:59:49.970833 systemd[1]: Started cri-containerd-a68c87573a20d4306a3417eee669a7db9ebc0b17a0c43f950ee0d82adfeb57f4.scope - libcontainer container a68c87573a20d4306a3417eee669a7db9ebc0b17a0c43f950ee0d82adfeb57f4. Mar 17 17:59:50.040404 containerd[1527]: time="2025-03-17T17:59:50.040326152Z" level=info msg="StartContainer for \"a68c87573a20d4306a3417eee669a7db9ebc0b17a0c43f950ee0d82adfeb57f4\" returns successfully" Mar 17 17:59:51.286689 containerd[1527]: time="2025-03-17T17:59:51.286643076Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" Mar 17 17:59:51.287316 containerd[1527]: time="2025-03-17T17:59:51.286782503Z" level=info msg="TearDown network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" successfully" Mar 17 17:59:51.287316 containerd[1527]: time="2025-03-17T17:59:51.286796028Z" level=info msg="StopPodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" returns successfully" Mar 17 17:59:51.293323 containerd[1527]: time="2025-03-17T17:59:51.293273382Z" level=info msg="RemovePodSandbox for \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" Mar 17 17:59:51.303435 containerd[1527]: time="2025-03-17T17:59:51.303397280Z" level=info msg="Forcibly stopping sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\"" Mar 17 17:59:51.303585 
containerd[1527]: time="2025-03-17T17:59:51.303520707Z" level=info msg="TearDown network for sandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" successfully" Mar 17 17:59:51.667309 containerd[1527]: time="2025-03-17T17:59:51.667171502Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:59:51.667309 containerd[1527]: time="2025-03-17T17:59:51.667245770Z" level=info msg="RemovePodSandbox \"ff01fb2ff506669d249784bb617df1d81f1295f48fed83808f2cd7ea84f964d5\" returns successfully" Mar 17 17:59:51.667851 containerd[1527]: time="2025-03-17T17:59:51.667821803Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\"" Mar 17 17:59:51.667958 containerd[1527]: time="2025-03-17T17:59:51.667929942Z" level=info msg="TearDown network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" successfully" Mar 17 17:59:51.667958 containerd[1527]: time="2025-03-17T17:59:51.667947055Z" level=info msg="StopPodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" returns successfully" Mar 17 17:59:51.668216 containerd[1527]: time="2025-03-17T17:59:51.668158184Z" level=info msg="RemovePodSandbox for \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\"" Mar 17 17:59:51.668216 containerd[1527]: time="2025-03-17T17:59:51.668189542Z" level=info msg="Forcibly stopping sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\"" Mar 17 17:59:51.668368 containerd[1527]: time="2025-03-17T17:59:51.668282864Z" level=info msg="TearDown network for sandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" successfully" Mar 17 17:59:51.679691 containerd[1527]: time="2025-03-17T17:59:51.679625905Z" level=warning 
msg="Failed to get podSandbox status for container event for sandboxID \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:59:51.679799 containerd[1527]: time="2025-03-17T17:59:51.679730789Z" level=info msg="RemovePodSandbox \"dac6a8bd40992872aa318c7602ce610c9def658e4b8a9dbf2c620cfec646e62d\" returns successfully" Mar 17 17:59:51.680340 containerd[1527]: time="2025-03-17T17:59:51.680154491Z" level=info msg="StopPodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\"" Mar 17 17:59:51.680340 containerd[1527]: time="2025-03-17T17:59:51.680268532Z" level=info msg="TearDown network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" successfully" Mar 17 17:59:51.680340 containerd[1527]: time="2025-03-17T17:59:51.680279622Z" level=info msg="StopPodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" returns successfully" Mar 17 17:59:51.680659 containerd[1527]: time="2025-03-17T17:59:51.680631632Z" level=info msg="RemovePodSandbox for \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\"" Mar 17 17:59:51.680723 containerd[1527]: time="2025-03-17T17:59:51.680661266Z" level=info msg="Forcibly stopping sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\"" Mar 17 17:59:51.680782 containerd[1527]: time="2025-03-17T17:59:51.680740623Z" level=info msg="TearDown network for sandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" successfully" Mar 17 17:59:51.684523 containerd[1527]: time="2025-03-17T17:59:51.684483747Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:51.684622 containerd[1527]: time="2025-03-17T17:59:51.684525554Z" level=info msg="RemovePodSandbox \"aa9ab902fd3d3f9e40f7985ffd7b17fe5ecffef26cfddf95577baef03915c59b\" returns successfully" Mar 17 17:59:51.684936 containerd[1527]: time="2025-03-17T17:59:51.684901638Z" level=info msg="StopPodSandbox for \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\"" Mar 17 17:59:51.685045 containerd[1527]: time="2025-03-17T17:59:51.685016731Z" level=info msg="TearDown network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\" successfully" Mar 17 17:59:51.685045 containerd[1527]: time="2025-03-17T17:59:51.685031338Z" level=info msg="StopPodSandbox for \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\" returns successfully" Mar 17 17:59:51.685322 containerd[1527]: time="2025-03-17T17:59:51.685296878Z" level=info msg="RemovePodSandbox for \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\"" Mar 17 17:59:51.685362 containerd[1527]: time="2025-03-17T17:59:51.685325471Z" level=info msg="Forcibly stopping sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\"" Mar 17 17:59:51.685441 containerd[1527]: time="2025-03-17T17:59:51.685400951Z" level=info msg="TearDown network for sandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\" successfully" Mar 17 17:59:51.689861 containerd[1527]: time="2025-03-17T17:59:51.689821165Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:51.689861 containerd[1527]: time="2025-03-17T17:59:51.689870206Z" level=info msg="RemovePodSandbox \"358bed422d8b9295f65b7df77d939e76926506fcf5c372af5e35c2fbcf165330\" returns successfully" Mar 17 17:59:51.690300 containerd[1527]: time="2025-03-17T17:59:51.690234307Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" Mar 17 17:59:51.690379 containerd[1527]: time="2025-03-17T17:59:51.690355692Z" level=info msg="TearDown network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" successfully" Mar 17 17:59:51.690411 containerd[1527]: time="2025-03-17T17:59:51.690375780Z" level=info msg="StopPodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" returns successfully" Mar 17 17:59:51.690732 containerd[1527]: time="2025-03-17T17:59:51.690694178Z" level=info msg="RemovePodSandbox for \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" Mar 17 17:59:51.690798 containerd[1527]: time="2025-03-17T17:59:51.690729743Z" level=info msg="Forcibly stopping sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\"" Mar 17 17:59:51.690858 containerd[1527]: time="2025-03-17T17:59:51.690821412Z" level=info msg="TearDown network for sandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" successfully" Mar 17 17:59:51.695808 containerd[1527]: time="2025-03-17T17:59:51.695740758Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:51.695808 containerd[1527]: time="2025-03-17T17:59:51.695821527Z" level=info msg="RemovePodSandbox \"47c1c02d1732bc9d45c3341a2419f1d53215ee638b7935ed0bf75a0f771b4664\" returns successfully" Mar 17 17:59:51.696220 containerd[1527]: time="2025-03-17T17:59:51.696152568Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\"" Mar 17 17:59:51.696367 containerd[1527]: time="2025-03-17T17:59:51.696258985Z" level=info msg="TearDown network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" successfully" Mar 17 17:59:51.696367 containerd[1527]: time="2025-03-17T17:59:51.696271819Z" level=info msg="StopPodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" returns successfully" Mar 17 17:59:51.696624 containerd[1527]: time="2025-03-17T17:59:51.696529605Z" level=info msg="RemovePodSandbox for \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\"" Mar 17 17:59:51.696624 containerd[1527]: time="2025-03-17T17:59:51.696555152Z" level=info msg="Forcibly stopping sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\"" Mar 17 17:59:51.696743 containerd[1527]: time="2025-03-17T17:59:51.696695712Z" level=info msg="TearDown network for sandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" successfully" Mar 17 17:59:51.701026 containerd[1527]: time="2025-03-17T17:59:51.700970347Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:51.701151 containerd[1527]: time="2025-03-17T17:59:51.701066695Z" level=info msg="RemovePodSandbox \"c68cef4308ec295a7c9296ff51fa3ab044170ea8a1e1f6e6633a6f4e62af79e5\" returns successfully" Mar 17 17:59:51.701594 containerd[1527]: time="2025-03-17T17:59:51.701559625Z" level=info msg="StopPodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\"" Mar 17 17:59:51.701684 containerd[1527]: time="2025-03-17T17:59:51.701669047Z" level=info msg="TearDown network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" successfully" Mar 17 17:59:51.701715 containerd[1527]: time="2025-03-17T17:59:51.701682372Z" level=info msg="StopPodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" returns successfully" Mar 17 17:59:51.701954 containerd[1527]: time="2025-03-17T17:59:51.701923547Z" level=info msg="RemovePodSandbox for \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\"" Mar 17 17:59:51.701954 containerd[1527]: time="2025-03-17T17:59:51.701944646Z" level=info msg="Forcibly stopping sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\"" Mar 17 17:59:51.702078 containerd[1527]: time="2025-03-17T17:59:51.702009767Z" level=info msg="TearDown network for sandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" successfully" Mar 17 17:59:51.705626 containerd[1527]: time="2025-03-17T17:59:51.705570543Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:51.705724 containerd[1527]: time="2025-03-17T17:59:51.705639611Z" level=info msg="RemovePodSandbox \"b13dd7d0d89481668a16cad35a6acd8c37446cce625b4902779420914a7a5b07\" returns successfully" Mar 17 17:59:51.706075 containerd[1527]: time="2025-03-17T17:59:51.706045600Z" level=info msg="StopPodSandbox for \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\"" Mar 17 17:59:51.706215 containerd[1527]: time="2025-03-17T17:59:51.706187482Z" level=info msg="TearDown network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\" successfully" Mar 17 17:59:51.706215 containerd[1527]: time="2025-03-17T17:59:51.706205837Z" level=info msg="StopPodSandbox for \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\" returns successfully" Mar 17 17:59:51.706545 containerd[1527]: time="2025-03-17T17:59:51.706518514Z" level=info msg="RemovePodSandbox for \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\"" Mar 17 17:59:51.706545 containerd[1527]: time="2025-03-17T17:59:51.706541256Z" level=info msg="Forcibly stopping sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\"" Mar 17 17:59:51.706682 containerd[1527]: time="2025-03-17T17:59:51.706638044Z" level=info msg="TearDown network for sandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\" successfully" Mar 17 17:59:51.813600 containerd[1527]: time="2025-03-17T17:59:51.812193941Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:51.813600 containerd[1527]: time="2025-03-17T17:59:51.812292433Z" level=info msg="RemovePodSandbox \"ed684fe1e1c1385c6ad93d561cf0ce772ccdb5322ccea204e3605746bd955f09\" returns successfully" Mar 17 17:59:51.813600 containerd[1527]: time="2025-03-17T17:59:51.812844333Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:51.813600 containerd[1527]: time="2025-03-17T17:59:51.812977398Z" level=info msg="TearDown network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" successfully" Mar 17 17:59:51.813600 containerd[1527]: time="2025-03-17T17:59:51.813008435Z" level=info msg="StopPodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" returns successfully" Mar 17 17:59:51.814838 containerd[1527]: time="2025-03-17T17:59:51.814811926Z" level=info msg="RemovePodSandbox for \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:51.815014 containerd[1527]: time="2025-03-17T17:59:51.814934092Z" level=info msg="Forcibly stopping sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\"" Mar 17 17:59:51.815482 containerd[1527]: time="2025-03-17T17:59:51.815420841Z" level=info msg="TearDown network for sandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" successfully" Mar 17 17:59:52.806390 containerd[1527]: time="2025-03-17T17:59:52.806327191Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.807027 containerd[1527]: time="2025-03-17T17:59:52.806407710Z" level=info msg="RemovePodSandbox \"175da80da3ca9c1477474587caeb240f7e14636f3c4c9adcdb7f4fe631316610\" returns successfully" Mar 17 17:59:52.807027 containerd[1527]: time="2025-03-17T17:59:52.806857070Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" Mar 17 17:59:52.807027 containerd[1527]: time="2025-03-17T17:59:52.806956024Z" level=info msg="TearDown network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" successfully" Mar 17 17:59:52.807027 containerd[1527]: time="2025-03-17T17:59:52.807001178Z" level=info msg="StopPodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" returns successfully" Mar 17 17:59:52.807294 containerd[1527]: time="2025-03-17T17:59:52.807229770Z" level=info msg="RemovePodSandbox for \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" Mar 17 17:59:52.807294 containerd[1527]: time="2025-03-17T17:59:52.807252452Z" level=info msg="Forcibly stopping sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\"" Mar 17 17:59:52.807388 containerd[1527]: time="2025-03-17T17:59:52.807336517Z" level=info msg="TearDown network for sandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" successfully" Mar 17 17:59:52.819420 containerd[1527]: time="2025-03-17T17:59:52.819383832Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.819499 containerd[1527]: time="2025-03-17T17:59:52.819447239Z" level=info msg="RemovePodSandbox \"22f99451326c8db19e0f10e1a733c7d3c321a69392b37a38adc9add244df4f4b\" returns successfully" Mar 17 17:59:52.819938 containerd[1527]: time="2025-03-17T17:59:52.819771599Z" level=info msg="StopPodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\"" Mar 17 17:59:52.819938 containerd[1527]: time="2025-03-17T17:59:52.819920764Z" level=info msg="TearDown network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" successfully" Mar 17 17:59:52.819938 containerd[1527]: time="2025-03-17T17:59:52.819934881Z" level=info msg="StopPodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" returns successfully" Mar 17 17:59:52.820196 containerd[1527]: time="2025-03-17T17:59:52.820171869Z" level=info msg="RemovePodSandbox for \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\"" Mar 17 17:59:52.820259 containerd[1527]: time="2025-03-17T17:59:52.820196094Z" level=info msg="Forcibly stopping sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\"" Mar 17 17:59:52.820309 containerd[1527]: time="2025-03-17T17:59:52.820269600Z" level=info msg="TearDown network for sandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" successfully" Mar 17 17:59:52.824503 containerd[1527]: time="2025-03-17T17:59:52.824456525Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.824565 containerd[1527]: time="2025-03-17T17:59:52.824502210Z" level=info msg="RemovePodSandbox \"6b8213bfa605189addf6cfa5631497ef45b433549351329ff1105252f5c73d8d\" returns successfully" Mar 17 17:59:52.824837 containerd[1527]: time="2025-03-17T17:59:52.824812373Z" level=info msg="StopPodSandbox for \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\"" Mar 17 17:59:52.824934 containerd[1527]: time="2025-03-17T17:59:52.824909513Z" level=info msg="TearDown network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\" successfully" Mar 17 17:59:52.824934 containerd[1527]: time="2025-03-17T17:59:52.824927196Z" level=info msg="StopPodSandbox for \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\" returns successfully" Mar 17 17:59:52.825753 containerd[1527]: time="2025-03-17T17:59:52.825254861Z" level=info msg="RemovePodSandbox for \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\"" Mar 17 17:59:52.825753 containerd[1527]: time="2025-03-17T17:59:52.825278205Z" level=info msg="Forcibly stopping sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\"" Mar 17 17:59:52.825753 containerd[1527]: time="2025-03-17T17:59:52.825364144Z" level=info msg="TearDown network for sandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\" successfully" Mar 17 17:59:52.831313 containerd[1527]: time="2025-03-17T17:59:52.831275468Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.831383 containerd[1527]: time="2025-03-17T17:59:52.831317897Z" level=info msg="RemovePodSandbox \"66e11466e57569371915b31cb4606d0224264adb1b24a33f641b3ac4a7fcd41c\" returns successfully" Mar 17 17:59:52.831559 containerd[1527]: time="2025-03-17T17:59:52.831531872Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" Mar 17 17:59:52.831708 containerd[1527]: time="2025-03-17T17:59:52.831688673Z" level=info msg="TearDown network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" successfully" Mar 17 17:59:52.831749 containerd[1527]: time="2025-03-17T17:59:52.831707056Z" level=info msg="StopPodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" returns successfully" Mar 17 17:59:52.833615 containerd[1527]: time="2025-03-17T17:59:52.831930529Z" level=info msg="RemovePodSandbox for \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" Mar 17 17:59:52.833615 containerd[1527]: time="2025-03-17T17:59:52.831972327Z" level=info msg="Forcibly stopping sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\"" Mar 17 17:59:52.833615 containerd[1527]: time="2025-03-17T17:59:52.832049379Z" level=info msg="TearDown network for sandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" successfully" Mar 17 17:59:52.836392 containerd[1527]: time="2025-03-17T17:59:52.836364651Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.836460 containerd[1527]: time="2025-03-17T17:59:52.836411128Z" level=info msg="RemovePodSandbox \"03eeefefe13e76dba68e21007a8ad3966309fd8bfa687c5f86339b482ff43815\" returns successfully" Mar 17 17:59:52.836690 containerd[1527]: time="2025-03-17T17:59:52.836665287Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\"" Mar 17 17:59:52.836787 containerd[1527]: time="2025-03-17T17:59:52.836763159Z" level=info msg="TearDown network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" successfully" Mar 17 17:59:52.836831 containerd[1527]: time="2025-03-17T17:59:52.836784307Z" level=info msg="StopPodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" returns successfully" Mar 17 17:59:52.837126 containerd[1527]: time="2025-03-17T17:59:52.837000327Z" level=info msg="RemovePodSandbox for \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\"" Mar 17 17:59:52.837126 containerd[1527]: time="2025-03-17T17:59:52.837035062Z" level=info msg="Forcibly stopping sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\"" Mar 17 17:59:52.837126 containerd[1527]: time="2025-03-17T17:59:52.837107455Z" level=info msg="TearDown network for sandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" successfully" Mar 17 17:59:52.842077 containerd[1527]: time="2025-03-17T17:59:52.841924135Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.842077 containerd[1527]: time="2025-03-17T17:59:52.841963377Z" level=info msg="RemovePodSandbox \"574d59b620d07170536ad7d7c836be88ffcac001578e2a1cc0c31aa3ff54d4fc\" returns successfully" Mar 17 17:59:52.842329 containerd[1527]: time="2025-03-17T17:59:52.842277659Z" level=info msg="StopPodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\"" Mar 17 17:59:52.842412 containerd[1527]: time="2025-03-17T17:59:52.842388605Z" level=info msg="TearDown network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" successfully" Mar 17 17:59:52.842412 containerd[1527]: time="2025-03-17T17:59:52.842405185Z" level=info msg="StopPodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" returns successfully" Mar 17 17:59:52.842647 containerd[1527]: time="2025-03-17T17:59:52.842621626Z" level=info msg="RemovePodSandbox for \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\"" Mar 17 17:59:52.842706 containerd[1527]: time="2025-03-17T17:59:52.842647774Z" level=info msg="Forcibly stopping sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\"" Mar 17 17:59:52.842756 containerd[1527]: time="2025-03-17T17:59:52.842721881Z" level=info msg="TearDown network for sandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" successfully" Mar 17 17:59:52.849156 containerd[1527]: time="2025-03-17T17:59:52.849118963Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:59:52.849240 containerd[1527]: time="2025-03-17T17:59:52.849166061Z" level=info msg="RemovePodSandbox \"607eb730050b2aee59fc6a9be9950eebe1e0025c46ed47da43f24b05da799c08\" returns successfully"
Mar 17 17:59:52.849656 containerd[1527]: time="2025-03-17T17:59:52.849613659Z" level=info msg="StopPodSandbox for \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\""
Mar 17 17:59:52.849852 containerd[1527]: time="2025-03-17T17:59:52.849831812Z" level=info msg="TearDown network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\" successfully"
Mar 17 17:59:52.849852 containerd[1527]: time="2025-03-17T17:59:52.849850427Z" level=info msg="StopPodSandbox for \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\" returns successfully"
Mar 17 17:59:52.850234 containerd[1527]: time="2025-03-17T17:59:52.850201256Z" level=info msg="RemovePodSandbox for \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\""
Mar 17 17:59:52.850234 containerd[1527]: time="2025-03-17T17:59:52.850225941Z" level=info msg="Forcibly stopping sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\""
Mar 17 17:59:52.850368 containerd[1527]: time="2025-03-17T17:59:52.850302613Z" level=info msg="TearDown network for sandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\" successfully"
Mar 17 17:59:52.857051 containerd[1527]: time="2025-03-17T17:59:52.857025038Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.857109 containerd[1527]: time="2025-03-17T17:59:52.857063519Z" level=info msg="RemovePodSandbox \"4727b14246c3e5d8617b2e242cb1b9556226b370c037221b02fa2b0ced281c39\" returns successfully"
Mar 17 17:59:52.857381 containerd[1527]: time="2025-03-17T17:59:52.857364485Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\""
Mar 17 17:59:52.857455 containerd[1527]: time="2025-03-17T17:59:52.857438533Z" level=info msg="TearDown network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" successfully"
Mar 17 17:59:52.857455 containerd[1527]: time="2025-03-17T17:59:52.857450494Z" level=info msg="StopPodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" returns successfully"
Mar 17 17:59:52.857909 containerd[1527]: time="2025-03-17T17:59:52.857744519Z" level=info msg="RemovePodSandbox for \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\""
Mar 17 17:59:52.857909 containerd[1527]: time="2025-03-17T17:59:52.857769174Z" level=info msg="Forcibly stopping sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\""
Mar 17 17:59:52.857909 containerd[1527]: time="2025-03-17T17:59:52.857853019Z" level=info msg="TearDown network for sandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" successfully"
Mar 17 17:59:52.894096 containerd[1527]: time="2025-03-17T17:59:52.894059471Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.894337 containerd[1527]: time="2025-03-17T17:59:52.894254081Z" level=info msg="RemovePodSandbox \"e267d1516e64a4459bd1e3efc4e7cf926989bcec8067cbb09c2d0ce4955b89c6\" returns successfully"
Mar 17 17:59:52.894628 containerd[1527]: time="2025-03-17T17:59:52.894606042Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\""
Mar 17 17:59:52.894713 containerd[1527]: time="2025-03-17T17:59:52.894699635Z" level=info msg="TearDown network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" successfully"
Mar 17 17:59:52.894751 containerd[1527]: time="2025-03-17T17:59:52.894711968Z" level=info msg="StopPodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" returns successfully"
Mar 17 17:59:52.894983 containerd[1527]: time="2025-03-17T17:59:52.894964325Z" level=info msg="RemovePodSandbox for \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\""
Mar 17 17:59:52.894983 containerd[1527]: time="2025-03-17T17:59:52.894982439Z" level=info msg="Forcibly stopping sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\""
Mar 17 17:59:52.895137 containerd[1527]: time="2025-03-17T17:59:52.895039433Z" level=info msg="TearDown network for sandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" successfully"
Mar 17 17:59:52.902793 containerd[1527]: time="2025-03-17T17:59:52.902757791Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.902879 containerd[1527]: time="2025-03-17T17:59:52.902804077Z" level=info msg="RemovePodSandbox \"dfcf98b5c5bc161a9f4c75beeec097db762d06de7dab74742b20d39e59c9ffd2\" returns successfully"
Mar 17 17:59:52.903296 containerd[1527]: time="2025-03-17T17:59:52.903270649Z" level=info msg="StopPodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\""
Mar 17 17:59:52.903394 containerd[1527]: time="2025-03-17T17:59:52.903374491Z" level=info msg="TearDown network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" successfully"
Mar 17 17:59:52.903394 containerd[1527]: time="2025-03-17T17:59:52.903386564Z" level=info msg="StopPodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" returns successfully"
Mar 17 17:59:52.903725 containerd[1527]: time="2025-03-17T17:59:52.903674306Z" level=info msg="RemovePodSandbox for \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\""
Mar 17 17:59:52.903725 containerd[1527]: time="2025-03-17T17:59:52.903701426Z" level=info msg="Forcibly stopping sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\""
Mar 17 17:59:52.904096 containerd[1527]: time="2025-03-17T17:59:52.903939466Z" level=info msg="TearDown network for sandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" successfully"
Mar 17 17:59:52.910488 containerd[1527]: time="2025-03-17T17:59:52.910460939Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.910612 containerd[1527]: time="2025-03-17T17:59:52.910506083Z" level=info msg="RemovePodSandbox \"62b2b56764117c20274bf6a26dd52cd73b083b9d482a4ea5bbe12c8b16e8fb26\" returns successfully"
Mar 17 17:59:52.910862 containerd[1527]: time="2025-03-17T17:59:52.910825694Z" level=info msg="StopPodSandbox for \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\""
Mar 17 17:59:52.910961 containerd[1527]: time="2025-03-17T17:59:52.910944614Z" level=info msg="TearDown network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\" successfully"
Mar 17 17:59:52.911002 containerd[1527]: time="2025-03-17T17:59:52.910958740Z" level=info msg="StopPodSandbox for \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\" returns successfully"
Mar 17 17:59:52.911249 containerd[1527]: time="2025-03-17T17:59:52.911230933Z" level=info msg="RemovePodSandbox for \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\""
Mar 17 17:59:52.912615 containerd[1527]: time="2025-03-17T17:59:52.911394256Z" level=info msg="Forcibly stopping sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\""
Mar 17 17:59:52.912615 containerd[1527]: time="2025-03-17T17:59:52.911473001Z" level=info msg="TearDown network for sandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\" successfully"
Mar 17 17:59:52.917084 containerd[1527]: time="2025-03-17T17:59:52.917058694Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.917179 containerd[1527]: time="2025-03-17T17:59:52.917099128Z" level=info msg="RemovePodSandbox \"9ddf492caf3274b3c82612b64410d4d69dfa875d8b8d9b19526874fa1576c084\" returns successfully"
Mar 17 17:59:52.917429 containerd[1527]: time="2025-03-17T17:59:52.917400055Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\""
Mar 17 17:59:52.917501 containerd[1527]: time="2025-03-17T17:59:52.917482046Z" level=info msg="TearDown network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" successfully"
Mar 17 17:59:52.917501 containerd[1527]: time="2025-03-17T17:59:52.917495331Z" level=info msg="StopPodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" returns successfully"
Mar 17 17:59:52.919862 containerd[1527]: time="2025-03-17T17:59:52.917725256Z" level=info msg="RemovePodSandbox for \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\""
Mar 17 17:59:52.919862 containerd[1527]: time="2025-03-17T17:59:52.917749421Z" level=info msg="Forcibly stopping sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\""
Mar 17 17:59:52.919862 containerd[1527]: time="2025-03-17T17:59:52.917853233Z" level=info msg="TearDown network for sandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" successfully"
Mar 17 17:59:52.923047 containerd[1527]: time="2025-03-17T17:59:52.923023898Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.923178 containerd[1527]: time="2025-03-17T17:59:52.923145362Z" level=info msg="RemovePodSandbox \"2d682889958eea63a38bc8e4ff6b2bec98fc22cfd2f12a4e53997d3fc35a57fb\" returns successfully"
Mar 17 17:59:52.923493 containerd[1527]: time="2025-03-17T17:59:52.923472357Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\""
Mar 17 17:59:52.923651 containerd[1527]: time="2025-03-17T17:59:52.923632032Z" level=info msg="TearDown network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" successfully"
Mar 17 17:59:52.923754 containerd[1527]: time="2025-03-17T17:59:52.923736005Z" level=info msg="StopPodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" returns successfully"
Mar 17 17:59:52.924076 containerd[1527]: time="2025-03-17T17:59:52.924041800Z" level=info msg="RemovePodSandbox for \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\""
Mar 17 17:59:52.924076 containerd[1527]: time="2025-03-17T17:59:52.924067137Z" level=info msg="Forcibly stopping sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\""
Mar 17 17:59:52.924202 containerd[1527]: time="2025-03-17T17:59:52.924166451Z" level=info msg="TearDown network for sandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" successfully"
Mar 17 17:59:52.929338 containerd[1527]: time="2025-03-17T17:59:52.929308573Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.929418 containerd[1527]: time="2025-03-17T17:59:52.929360578Z" level=info msg="RemovePodSandbox \"f9d792c0b230948625c4596046bb3ae54c4501bc925d6952747ab9bc0cb4da99\" returns successfully"
Mar 17 17:59:52.929724 containerd[1527]: time="2025-03-17T17:59:52.929700077Z" level=info msg="StopPodSandbox for \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\""
Mar 17 17:59:52.929824 containerd[1527]: time="2025-03-17T17:59:52.929805722Z" level=info msg="TearDown network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" successfully"
Mar 17 17:59:52.929824 containerd[1527]: time="2025-03-17T17:59:52.929821481Z" level=info msg="StopPodSandbox for \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" returns successfully"
Mar 17 17:59:52.930190 containerd[1527]: time="2025-03-17T17:59:52.930169505Z" level=info msg="RemovePodSandbox for \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\""
Mar 17 17:59:52.930242 containerd[1527]: time="2025-03-17T17:59:52.930190754Z" level=info msg="Forcibly stopping sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\""
Mar 17 17:59:52.930295 containerd[1527]: time="2025-03-17T17:59:52.930265753Z" level=info msg="TearDown network for sandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" successfully"
Mar 17 17:59:52.934463 containerd[1527]: time="2025-03-17T17:59:52.934439223Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.934543 containerd[1527]: time="2025-03-17T17:59:52.934476732Z" level=info msg="RemovePodSandbox \"604cc4fa7e67c9bf051afc3253f8f9196aa43c0b226fdea7d5fa1e1437c0c5d2\" returns successfully"
Mar 17 17:59:52.934734 containerd[1527]: time="2025-03-17T17:59:52.934704543Z" level=info msg="StopPodSandbox for \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\""
Mar 17 17:59:52.934819 containerd[1527]: time="2025-03-17T17:59:52.934797305Z" level=info msg="TearDown network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\" successfully"
Mar 17 17:59:52.934819 containerd[1527]: time="2025-03-17T17:59:52.934813204Z" level=info msg="StopPodSandbox for \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\" returns successfully"
Mar 17 17:59:52.935043 containerd[1527]: time="2025-03-17T17:59:52.935019797Z" level=info msg="RemovePodSandbox for \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\""
Mar 17 17:59:52.935086 containerd[1527]: time="2025-03-17T17:59:52.935043581Z" level=info msg="Forcibly stopping sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\""
Mar 17 17:59:52.935160 containerd[1527]: time="2025-03-17T17:59:52.935117167Z" level=info msg="TearDown network for sandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\" successfully"
Mar 17 17:59:52.940695 containerd[1527]: time="2025-03-17T17:59:52.940655280Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:59:52.940764 containerd[1527]: time="2025-03-17T17:59:52.940694884Z" level=info msg="RemovePodSandbox \"a95ae50187efca353f8fd52b46cc94a7090108607ac367668a5c13e709014be3\" returns successfully"
Mar 17 17:59:53.046861 systemd[1]: Started sshd@14-10.0.0.118:22-10.0.0.1:58816.service - OpenSSH per-connection server daemon (10.0.0.1:58816).
Mar 17 17:59:53.094256 sshd[5560]: Accepted publickey for core from 10.0.0.1 port 58816 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:59:53.098326 sshd-session[5560]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:59:53.104173 systemd-logind[1507]: New session 15 of user core.
Mar 17 17:59:53.110206 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 17 17:59:53.260101 sshd[5562]: Connection closed by 10.0.0.1 port 58816
Mar 17 17:59:53.260881 sshd-session[5560]: pam_unix(sshd:session): session closed for user core
Mar 17 17:59:53.265966 systemd[1]: sshd@14-10.0.0.118:22-10.0.0.1:58816.service: Deactivated successfully.
Mar 17 17:59:53.269802 systemd[1]: session-15.scope: Deactivated successfully.
Mar 17 17:59:53.275522 systemd-logind[1507]: Session 15 logged out. Waiting for processes to exit.
Mar 17 17:59:53.277768 systemd-logind[1507]: Removed session 15.
Mar 17 17:59:53.728445 containerd[1527]: time="2025-03-17T17:59:53.728388738Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:53.803643 containerd[1527]: time="2025-03-17T17:59:53.803551161Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204"
Mar 17 17:59:53.840310 containerd[1527]: time="2025-03-17T17:59:53.840257706Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:53.863515 containerd[1527]: time="2025-03-17T17:59:53.863444649Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:53.864247 containerd[1527]: time="2025-03-17T17:59:53.864217079Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 4.167576987s"
Mar 17 17:59:53.864309 containerd[1527]: time="2025-03-17T17:59:53.864252334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 17 17:59:53.865397 containerd[1527]: time="2025-03-17T17:59:53.865262626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 17 17:59:53.866426 containerd[1527]: time="2025-03-17T17:59:53.866401045Z" level=info msg="CreateContainer within sandbox \"37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 17 17:59:54.101966 containerd[1527]: time="2025-03-17T17:59:54.101850790Z" level=info msg="CreateContainer within sandbox \"37a7a858ed3513846857032f2500964f2433f3e1c2946fb56d3fa4e140e1a3df\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5d977e7208d3fecb26390ac42ea0c9141318813a54f86b9c93733c307200be8e\""
Mar 17 17:59:54.102420 containerd[1527]: time="2025-03-17T17:59:54.102391693Z" level=info msg="StartContainer for \"5d977e7208d3fecb26390ac42ea0c9141318813a54f86b9c93733c307200be8e\""
Mar 17 17:59:54.136729 systemd[1]: Started cri-containerd-5d977e7208d3fecb26390ac42ea0c9141318813a54f86b9c93733c307200be8e.scope - libcontainer container 5d977e7208d3fecb26390ac42ea0c9141318813a54f86b9c93733c307200be8e.
Mar 17 17:59:54.177735 containerd[1527]: time="2025-03-17T17:59:54.177697307Z" level=info msg="StartContainer for \"5d977e7208d3fecb26390ac42ea0c9141318813a54f86b9c93733c307200be8e\" returns successfully"
Mar 17 17:59:54.354326 containerd[1527]: time="2025-03-17T17:59:54.354173074Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:54.355244 containerd[1527]: time="2025-03-17T17:59:54.355175103Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 17 17:59:54.358595 containerd[1527]: time="2025-03-17T17:59:54.358353739Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 493.062791ms"
Mar 17 17:59:54.358595 containerd[1527]: time="2025-03-17T17:59:54.358385408Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 17 17:59:54.360096 containerd[1527]: time="2025-03-17T17:59:54.360060676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 17 17:59:54.362227 containerd[1527]: time="2025-03-17T17:59:54.362197139Z" level=info msg="CreateContainer within sandbox \"c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 17 17:59:54.375844 containerd[1527]: time="2025-03-17T17:59:54.375806795Z" level=info msg="CreateContainer within sandbox \"c8a259c4e082d2f3b603b0ec59816cfe9758e4ba08c1abb14a33a99fdbe81942\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"848d6d34218f2d27d799201e67e49ac9b96f456dffffeb66a828c14ee94e2ff5\""
Mar 17 17:59:54.376301 containerd[1527]: time="2025-03-17T17:59:54.376270536Z" level=info msg="StartContainer for \"848d6d34218f2d27d799201e67e49ac9b96f456dffffeb66a828c14ee94e2ff5\""
Mar 17 17:59:54.412762 systemd[1]: Started cri-containerd-848d6d34218f2d27d799201e67e49ac9b96f456dffffeb66a828c14ee94e2ff5.scope - libcontainer container 848d6d34218f2d27d799201e67e49ac9b96f456dffffeb66a828c14ee94e2ff5.
Mar 17 17:59:54.463789 containerd[1527]: time="2025-03-17T17:59:54.463733752Z" level=info msg="StartContainer for \"848d6d34218f2d27d799201e67e49ac9b96f456dffffeb66a828c14ee94e2ff5\" returns successfully"
Mar 17 17:59:54.613691 kubelet[2721]: I0317 17:59:54.612863 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5df8c6645f-9hw6m" podStartSLOduration=30.007103068 podStartE2EDuration="39.612851078s" podCreationTimestamp="2025-03-17 17:59:15 +0000 UTC" firstStartedPulling="2025-03-17 17:59:44.75355469 +0000 UTC m=+53.544397956" lastFinishedPulling="2025-03-17 17:59:54.3593027 +0000 UTC m=+63.150145966" observedRunningTime="2025-03-17 17:59:54.612478097 +0000 UTC m=+63.403321363" watchObservedRunningTime="2025-03-17 17:59:54.612851078 +0000 UTC m=+63.403694344"
Mar 17 17:59:55.192627 kubelet[2721]: I0317 17:59:55.192186 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5df8c6645f-bfh8t" podStartSLOduration=30.974101907 podStartE2EDuration="40.192167298s" podCreationTimestamp="2025-03-17 17:59:15 +0000 UTC" firstStartedPulling="2025-03-17 17:59:44.646959475 +0000 UTC m=+53.437802741" lastFinishedPulling="2025-03-17 17:59:53.865024866 +0000 UTC m=+62.655868132" observedRunningTime="2025-03-17 17:59:54.625977748 +0000 UTC m=+63.416821014" watchObservedRunningTime="2025-03-17 17:59:55.192167298 +0000 UTC m=+63.983010564"
Mar 17 17:59:55.601259 kubelet[2721]: I0317 17:59:55.601018 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:59:56.052250 containerd[1527]: time="2025-03-17T17:59:56.052191684Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:56.053015 containerd[1527]: time="2025-03-17T17:59:56.052973448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 17 17:59:56.054019 containerd[1527]: time="2025-03-17T17:59:56.053988015Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:56.056124 containerd[1527]: time="2025-03-17T17:59:56.056093000Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:59:56.056744 containerd[1527]: time="2025-03-17T17:59:56.056703716Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 1.696603357s"
Mar 17 17:59:56.056787 containerd[1527]: time="2025-03-17T17:59:56.056742188Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 17 17:59:56.058855 containerd[1527]: time="2025-03-17T17:59:56.058802690Z" level=info msg="CreateContainer within sandbox \"45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 17 17:59:56.073337 containerd[1527]: time="2025-03-17T17:59:56.073291679Z" level=info msg="CreateContainer within sandbox \"45fd2fb48e3668cf3c65ec93cdb0f826c5540f8f66ecb1e0326f1d93a8b8ec0f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4b7700e37d63c5809c11477ba5fe694e195689363507c7ef0e94f8b106957d62\""
Mar 17 17:59:56.073775 containerd[1527]: time="2025-03-17T17:59:56.073742258Z" level=info msg="StartContainer for \"4b7700e37d63c5809c11477ba5fe694e195689363507c7ef0e94f8b106957d62\""
Mar 17 17:59:56.099279 systemd[1]: run-containerd-runc-k8s.io-4b7700e37d63c5809c11477ba5fe694e195689363507c7ef0e94f8b106957d62-runc.1PYogC.mount: Deactivated successfully.
Mar 17 17:59:56.116838 systemd[1]: Started cri-containerd-4b7700e37d63c5809c11477ba5fe694e195689363507c7ef0e94f8b106957d62.scope - libcontainer container 4b7700e37d63c5809c11477ba5fe694e195689363507c7ef0e94f8b106957d62.
Mar 17 17:59:56.149522 containerd[1527]: time="2025-03-17T17:59:56.149480961Z" level=info msg="StartContainer for \"4b7700e37d63c5809c11477ba5fe694e195689363507c7ef0e94f8b106957d62\" returns successfully"
Mar 17 17:59:56.355291 kubelet[2721]: I0317 17:59:56.355161 2721 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 17 17:59:56.355291 kubelet[2721]: I0317 17:59:56.355200 2721 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 17 17:59:56.618271 kubelet[2721]: I0317 17:59:56.618101 2721 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-kdwpf" podStartSLOduration=29.623489216 podStartE2EDuration="41.618082279s" podCreationTimestamp="2025-03-17 17:59:15 +0000 UTC" firstStartedPulling="2025-03-17 17:59:44.062782713 +0000 UTC m=+52.853625980" lastFinishedPulling="2025-03-17 17:59:56.057375777 +0000 UTC m=+64.848219043" observedRunningTime="2025-03-17 17:59:56.615834418 +0000 UTC m=+65.406677684" watchObservedRunningTime="2025-03-17 17:59:56.618082279 +0000 UTC m=+65.408925545"
Mar 17 17:59:58.274936 systemd[1]: Started sshd@15-10.0.0.118:22-10.0.0.1:50706.service - OpenSSH per-connection server daemon (10.0.0.1:50706).
Mar 17 17:59:58.322906 sshd[5741]: Accepted publickey for core from 10.0.0.1 port 50706 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 17:59:58.325360 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:59:58.330128 systemd-logind[1507]: New session 16 of user core.
Mar 17 17:59:58.340702 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 17 17:59:58.458917 sshd[5743]: Connection closed by 10.0.0.1 port 50706
Mar 17 17:59:58.459297 sshd-session[5741]: pam_unix(sshd:session): session closed for user core
Mar 17 17:59:58.463683 systemd[1]: sshd@15-10.0.0.118:22-10.0.0.1:50706.service: Deactivated successfully.
Mar 17 17:59:58.465884 systemd[1]: session-16.scope: Deactivated successfully.
Mar 17 17:59:58.466672 systemd-logind[1507]: Session 16 logged out. Waiting for processes to exit.
Mar 17 17:59:58.467846 systemd-logind[1507]: Removed session 16.
Mar 17 17:59:59.280386 kubelet[2721]: I0317 17:59:59.280335 2721 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 18:00:01.657355 systemd[1]: run-containerd-runc-k8s.io-45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86-runc.DONDdP.mount: Deactivated successfully.
Mar 17 18:00:02.286522 kubelet[2721]: E0317 18:00:02.286473 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:03.472016 systemd[1]: Started sshd@16-10.0.0.118:22-10.0.0.1:50722.service - OpenSSH per-connection server daemon (10.0.0.1:50722).
Mar 17 18:00:03.514168 sshd[5779]: Accepted publickey for core from 10.0.0.1 port 50722 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:03.515609 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:03.519822 systemd-logind[1507]: New session 17 of user core.
Mar 17 18:00:03.540698 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 17 18:00:03.653461 sshd[5781]: Connection closed by 10.0.0.1 port 50722
Mar 17 18:00:03.653838 sshd-session[5779]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:03.665668 systemd[1]: sshd@16-10.0.0.118:22-10.0.0.1:50722.service: Deactivated successfully.
Mar 17 18:00:03.668236 systemd[1]: session-17.scope: Deactivated successfully.
Mar 17 18:00:03.670727 systemd-logind[1507]: Session 17 logged out. Waiting for processes to exit.
Mar 17 18:00:03.676842 systemd[1]: Started sshd@17-10.0.0.118:22-10.0.0.1:50738.service - OpenSSH per-connection server daemon (10.0.0.1:50738).
Mar 17 18:00:03.678485 systemd-logind[1507]: Removed session 17.
Mar 17 18:00:03.715526 sshd[5794]: Accepted publickey for core from 10.0.0.1 port 50738 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:03.717050 sshd-session[5794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:03.721529 systemd-logind[1507]: New session 18 of user core.
Mar 17 18:00:03.731704 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 17 18:00:04.013621 sshd[5797]: Connection closed by 10.0.0.1 port 50738
Mar 17 18:00:04.015283 sshd-session[5794]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:04.027156 systemd[1]: sshd@17-10.0.0.118:22-10.0.0.1:50738.service: Deactivated successfully.
Mar 17 18:00:04.029266 systemd[1]: session-18.scope: Deactivated successfully.
Mar 17 18:00:04.030179 systemd-logind[1507]: Session 18 logged out. Waiting for processes to exit.
Mar 17 18:00:04.036935 systemd[1]: Started sshd@18-10.0.0.118:22-10.0.0.1:50744.service - OpenSSH per-connection server daemon (10.0.0.1:50744).
Mar 17 18:00:04.038236 systemd-logind[1507]: Removed session 18.
Mar 17 18:00:04.082337 sshd[5808]: Accepted publickey for core from 10.0.0.1 port 50744 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:04.084211 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:04.088602 systemd-logind[1507]: New session 19 of user core.
Mar 17 18:00:04.101704 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 17 18:00:06.842814 sshd[5811]: Connection closed by 10.0.0.1 port 50744
Mar 17 18:00:06.843562 sshd-session[5808]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:06.857754 systemd[1]: sshd@18-10.0.0.118:22-10.0.0.1:50744.service: Deactivated successfully.
Mar 17 18:00:06.860440 systemd[1]: session-19.scope: Deactivated successfully.
Mar 17 18:00:06.860717 systemd[1]: session-19.scope: Consumed 549ms CPU time, 66M memory peak.
Mar 17 18:00:06.861215 systemd-logind[1507]: Session 19 logged out. Waiting for processes to exit.
Mar 17 18:00:06.870935 systemd[1]: Started sshd@19-10.0.0.118:22-10.0.0.1:44134.service - OpenSSH per-connection server daemon (10.0.0.1:44134).
Mar 17 18:00:06.872243 systemd-logind[1507]: Removed session 19.
Mar 17 18:00:06.912027 sshd[5837]: Accepted publickey for core from 10.0.0.1 port 44134 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:06.913625 sshd-session[5837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:06.918097 systemd-logind[1507]: New session 20 of user core.
Mar 17 18:00:06.922715 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 17 18:00:07.130200 sshd[5840]: Connection closed by 10.0.0.1 port 44134
Mar 17 18:00:07.133096 sshd-session[5837]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:07.144556 systemd[1]: sshd@19-10.0.0.118:22-10.0.0.1:44134.service: Deactivated successfully.
Mar 17 18:00:07.146438 systemd[1]: session-20.scope: Deactivated successfully.
Mar 17 18:00:07.147293 systemd-logind[1507]: Session 20 logged out. Waiting for processes to exit.
Mar 17 18:00:07.158154 systemd[1]: Started sshd@20-10.0.0.118:22-10.0.0.1:44150.service - OpenSSH per-connection server daemon (10.0.0.1:44150).
Mar 17 18:00:07.158974 systemd-logind[1507]: Removed session 20.
Mar 17 18:00:07.196110 sshd[5850]: Accepted publickey for core from 10.0.0.1 port 44150 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:07.197442 sshd-session[5850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:07.201868 systemd-logind[1507]: New session 21 of user core.
Mar 17 18:00:07.211694 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 17 18:00:07.323446 sshd[5853]: Connection closed by 10.0.0.1 port 44150
Mar 17 18:00:07.323848 sshd-session[5850]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:07.330094 systemd[1]: sshd@20-10.0.0.118:22-10.0.0.1:44150.service: Deactivated successfully.
Mar 17 18:00:07.332971 systemd[1]: session-21.scope: Deactivated successfully.
Mar 17 18:00:07.333897 systemd-logind[1507]: Session 21 logged out. Waiting for processes to exit.
Mar 17 18:00:07.335199 systemd-logind[1507]: Removed session 21.
Mar 17 18:00:10.286275 kubelet[2721]: E0317 18:00:10.286237 2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:12.335835 systemd[1]: Started sshd@21-10.0.0.118:22-10.0.0.1:44162.service - OpenSSH per-connection server daemon (10.0.0.1:44162).
Mar 17 18:00:12.377219 sshd[5869]: Accepted publickey for core from 10.0.0.1 port 44162 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:12.378925 sshd-session[5869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:12.383462 systemd-logind[1507]: New session 22 of user core.
Mar 17 18:00:12.390746 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 17 18:00:12.669895 sshd[5871]: Connection closed by 10.0.0.1 port 44162
Mar 17 18:00:12.670161 sshd-session[5869]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:12.674517 systemd[1]: sshd@21-10.0.0.118:22-10.0.0.1:44162.service: Deactivated successfully.
Mar 17 18:00:12.676910 systemd[1]: session-22.scope: Deactivated successfully.
Mar 17 18:00:12.677742 systemd-logind[1507]: Session 22 logged out. Waiting for processes to exit.
Mar 17 18:00:12.678699 systemd-logind[1507]: Removed session 22.
Mar 17 18:00:16.769840 containerd[1527]: time="2025-03-17T18:00:16.769753976Z" level=info msg="StopContainer for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" with timeout 300 (s)"
Mar 17 18:00:16.772560 containerd[1527]: time="2025-03-17T18:00:16.772523266Z" level=info msg="Stop container \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" with signal terminated"
Mar 17 18:00:17.003665 containerd[1527]: time="2025-03-17T18:00:17.003379941Z" level=info msg="StopContainer for \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\" with timeout 30 (s)"
Mar 17 18:00:17.003923 containerd[1527]: time="2025-03-17T18:00:17.003824384Z" level=info msg="Stop container \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\" with signal terminated"
Mar 17 18:00:17.021845 systemd[1]: cri-containerd-45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86.scope: Deactivated successfully.
Mar 17 18:00:17.053623 containerd[1527]: time="2025-03-17T18:00:17.053391608Z" level=info msg="shim disconnected" id=45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86 namespace=k8s.io
Mar 17 18:00:17.053623 containerd[1527]: time="2025-03-17T18:00:17.053454718Z" level=warning msg="cleaning up after shim disconnected" id=45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86 namespace=k8s.io
Mar 17 18:00:17.053623 containerd[1527]: time="2025-03-17T18:00:17.053465428Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:17.055349 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86-rootfs.mount: Deactivated successfully.
Mar 17 18:00:17.084305 containerd[1527]: time="2025-03-17T18:00:17.084252398Z" level=info msg="StopContainer for \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\" with timeout 5 (s)"
Mar 17 18:00:17.084758 containerd[1527]: time="2025-03-17T18:00:17.084725777Z" level=info msg="Stop container \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\" with signal terminated"
Mar 17 18:00:17.098767 containerd[1527]: time="2025-03-17T18:00:17.098697469Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:00:17Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Mar 17 18:00:17.114640 containerd[1527]: time="2025-03-17T18:00:17.114555200Z" level=info msg="StopContainer for \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\" returns successfully"
Mar 17 18:00:17.115734 containerd[1527]: time="2025-03-17T18:00:17.115199462Z" level=info msg="StopPodSandbox for \"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699\""
Mar 17 18:00:17.115734 containerd[1527]: time="2025-03-17T18:00:17.115243134Z" level=info msg="Container to stop \"45cd73cc9fbd064cef237d070ce9ca63361cd05080cb7e4206ccf81168630c86\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 17 18:00:17.119637 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699-shm.mount: Deactivated successfully.
Mar 17 18:00:17.126685 systemd[1]: cri-containerd-4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc.scope: Deactivated successfully.
Mar 17 18:00:17.127128 systemd[1]: cri-containerd-4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc.scope: Consumed 2.047s CPU time, 176.4M memory peak, 2.1M read from disk, 664K written to disk.
Mar 17 18:00:17.128569 systemd[1]: cri-containerd-9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699.scope: Deactivated successfully.
Mar 17 18:00:17.156654 containerd[1527]: time="2025-03-17T18:00:17.156534971Z" level=info msg="shim disconnected" id=4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc namespace=k8s.io
Mar 17 18:00:17.156654 containerd[1527]: time="2025-03-17T18:00:17.156646122Z" level=warning msg="cleaning up after shim disconnected" id=4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc namespace=k8s.io
Mar 17 18:00:17.156654 containerd[1527]: time="2025-03-17T18:00:17.156657033Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:17.159235 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc-rootfs.mount: Deactivated successfully.
Mar 17 18:00:17.164328 containerd[1527]: time="2025-03-17T18:00:17.164261127Z" level=info msg="shim disconnected" id=9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699 namespace=k8s.io
Mar 17 18:00:17.164328 containerd[1527]: time="2025-03-17T18:00:17.164319117Z" level=warning msg="cleaning up after shim disconnected" id=9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699 namespace=k8s.io
Mar 17 18:00:17.164414 containerd[1527]: time="2025-03-17T18:00:17.164329957Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:17.458596 containerd[1527]: time="2025-03-17T18:00:17.458533082Z" level=info msg="StopContainer for \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\" returns successfully"
Mar 17 18:00:17.459116 containerd[1527]: time="2025-03-17T18:00:17.459078737Z" level=info msg="StopPodSandbox for \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\""
Mar 17 18:00:17.459166 containerd[1527]: time="2025-03-17T18:00:17.459125536Z" level=info msg="Container to stop \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 17 18:00:17.459197 containerd[1527]: time="2025-03-17T18:00:17.459166654Z" level=info msg="Container to stop \"6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 17 18:00:17.459197 containerd[1527]: time="2025-03-17T18:00:17.459179267Z" level=info msg="Container to stop \"d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 17 18:00:17.466556 systemd[1]: cri-containerd-09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df.scope: Deactivated successfully.
Mar 17 18:00:17.484443 containerd[1527]: time="2025-03-17T18:00:17.484381013Z" level=info msg="shim disconnected" id=09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df namespace=k8s.io
Mar 17 18:00:17.484571 containerd[1527]: time="2025-03-17T18:00:17.484537480Z" level=warning msg="cleaning up after shim disconnected" id=09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df namespace=k8s.io
Mar 17 18:00:17.484571 containerd[1527]: time="2025-03-17T18:00:17.484554602Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:17.632490 systemd-networkd[1459]: cali4a9dfd194d5: Link DOWN
Mar 17 18:00:17.632500 systemd-networkd[1459]: cali4a9dfd194d5: Lost carrier
Mar 17 18:00:17.648793 kubelet[2721]: I0317 18:00:17.648533 2721 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699"
Mar 17 18:00:17.695175 systemd[1]: Started sshd@22-10.0.0.118:22-10.0.0.1:37962.service - OpenSSH per-connection server daemon (10.0.0.1:37962).
Mar 17 18:00:17.713800 containerd[1527]: time="2025-03-17T18:00:17.713613113Z" level=info msg="TearDown network for sandbox \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" successfully"
Mar 17 18:00:17.713800 containerd[1527]: time="2025-03-17T18:00:17.713652308Z" level=info msg="StopPodSandbox for \"09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df\" returns successfully"
Mar 17 18:00:17.746825 sshd[6098]: Accepted publickey for core from 10.0.0.1 port 37962 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:17.748181 kubelet[2721]: I0317 18:00:17.748149 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-bin-dir\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748260 kubelet[2721]: I0317 18:00:17.748187 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-lib-modules\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748260 kubelet[2721]: I0317 18:00:17.748204 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-xtables-lock\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748260 kubelet[2721]: I0317 18:00:17.748226 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-flexvol-driver-host\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748260 kubelet[2721]: I0317 18:00:17.748246 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-net-dir\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748260 kubelet[2721]: I0317 18:00:17.748262 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-run-calico\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748405 kubelet[2721]: I0317 18:00:17.748266 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.748405 kubelet[2721]: I0317 18:00:17.748291 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pxfm\" (UniqueName: \"kubernetes.io/projected/e803e84a-5e61-46fb-91a6-3411250f4200-kube-api-access-5pxfm\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748405 kubelet[2721]: I0317 18:00:17.748314 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e803e84a-5e61-46fb-91a6-3411250f4200-tigera-ca-bundle\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748405 kubelet[2721]: I0317 18:00:17.748320 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.748405 kubelet[2721]: I0317 18:00:17.748338 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-log-dir\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748529 kubelet[2721]: I0317 18:00:17.748348 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.748529 kubelet[2721]: I0317 18:00:17.748360 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-policysync\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748529 kubelet[2721]: I0317 18:00:17.748375 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.748529 kubelet[2721]: I0317 18:00:17.748380 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-lib-calico\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748529 kubelet[2721]: I0317 18:00:17.748403 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e803e84a-5e61-46fb-91a6-3411250f4200-node-certs\") pod \"e803e84a-5e61-46fb-91a6-3411250f4200\" (UID: \"e803e84a-5e61-46fb-91a6-3411250f4200\") "
Mar 17 18:00:17.748529 kubelet[2721]: I0317 18:00:17.748463 2721 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-bin-dir\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.748928 kubelet[2721]: I0317 18:00:17.748478 2721 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-flexvol-driver-host\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.748928 kubelet[2721]: I0317 18:00:17.748492 2721 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-lib-modules\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.748928 kubelet[2721]: I0317 18:00:17.748504 2721 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-xtables-lock\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.748928 kubelet[2721]: I0317 18:00:17.748781 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.748599 sshd-session[6098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:17.749279 kubelet[2721]: I0317 18:00:17.748954 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.749279 kubelet[2721]: I0317 18:00:17.748992 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-policysync" (OuterVolumeSpecName: "policysync") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.749279 kubelet[2721]: I0317 18:00:17.749014 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.749279 kubelet[2721]: I0317 18:00:17.749042 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 17 18:00:17.752423 kubelet[2721]: I0317 18:00:17.752374 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e803e84a-5e61-46fb-91a6-3411250f4200-node-certs" (OuterVolumeSpecName: "node-certs") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 18:00:17.752830 kubelet[2721]: I0317 18:00:17.752719 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e803e84a-5e61-46fb-91a6-3411250f4200-kube-api-access-5pxfm" (OuterVolumeSpecName: "kube-api-access-5pxfm") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "kube-api-access-5pxfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 18:00:17.754709 kubelet[2721]: I0317 18:00:17.754661 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e803e84a-5e61-46fb-91a6-3411250f4200-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "e803e84a-5e61-46fb-91a6-3411250f4200" (UID: "e803e84a-5e61-46fb-91a6-3411250f4200"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 18:00:17.755842 systemd-logind[1507]: New session 23 of user core.
Mar 17 18:00:17.764730 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.544 [INFO][6049] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.631 [INFO][6049] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" iface="eth0" netns="/var/run/netns/cni-099149a8-4876-33f7-2995-760104567d30"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.631 [INFO][6049] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" iface="eth0" netns="/var/run/netns/cni-099149a8-4876-33f7-2995-760104567d30"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.639 [INFO][6049] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" after=8.132307ms iface="eth0" netns="/var/run/netns/cni-099149a8-4876-33f7-2995-760104567d30"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.639 [INFO][6049] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.639 [INFO][6049] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.659 [INFO][6089] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" HandleID="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Workload="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.660 [INFO][6089] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.660 [INFO][6089] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.823 [INFO][6089] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" HandleID="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Workload="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.824 [INFO][6089] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" HandleID="k8s-pod-network.9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699" Workload="localhost-k8s-calico--kube--controllers--7bbc9d96f6--tnt5q-eth0"
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.825 [INFO][6089] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 17 18:00:17.832018 containerd[1527]: 2025-03-17 18:00:17.827 [INFO][6049] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699"
Mar 17 18:00:17.832721 containerd[1527]: time="2025-03-17T18:00:17.832310301Z" level=info msg="TearDown network for sandbox \"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699\" successfully"
Mar 17 18:00:17.832721 containerd[1527]: time="2025-03-17T18:00:17.832337583Z" level=info msg="StopPodSandbox for \"9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699\" returns successfully"
Mar 17 18:00:17.849669 kubelet[2721]: I0317 18:00:17.849625 2721 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-log-dir\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849669 kubelet[2721]: I0317 18:00:17.849659 2721 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-policysync\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849669 kubelet[2721]: I0317 18:00:17.849672 2721 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-lib-calico\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849900 kubelet[2721]: I0317 18:00:17.849693 2721 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e803e84a-5e61-46fb-91a6-3411250f4200-node-certs\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849900 kubelet[2721]: I0317 18:00:17.849704 2721 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-cni-net-dir\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849900 kubelet[2721]: I0317 18:00:17.849713 2721 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e803e84a-5e61-46fb-91a6-3411250f4200-var-run-calico\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849900 kubelet[2721]: I0317 18:00:17.849723 2721 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-5pxfm\" (UniqueName: \"kubernetes.io/projected/e803e84a-5e61-46fb-91a6-3411250f4200-kube-api-access-5pxfm\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.849900 kubelet[2721]: I0317 18:00:17.849737 2721 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e803e84a-5e61-46fb-91a6-3411250f4200-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:17.948355 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9b036dda0c4c0f8f7ed0792144a41b2d54b51f921beddcf1b7085a6429f81699-rootfs.mount: Deactivated successfully.
Mar 17 18:00:17.948471 systemd[1]: run-netns-cni\x2d099149a8\x2d4876\x2d33f7\x2d2995\x2d760104567d30.mount: Deactivated successfully.
Mar 17 18:00:17.948548 systemd[1]: var-lib-kubelet-pods-e803e84a\x2d5e61\x2d46fb\x2d91a6\x2d3411250f4200-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully.
Mar 17 18:00:17.948653 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df-rootfs.mount: Deactivated successfully.
Mar 17 18:00:17.948744 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-09c462ca4eb5e12131f6a7164d96abb378bb52ebee2f389db88e0fb9051e86df-shm.mount: Deactivated successfully.
Mar 17 18:00:17.948843 systemd[1]: var-lib-kubelet-pods-e803e84a\x2d5e61\x2d46fb\x2d91a6\x2d3411250f4200-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5pxfm.mount: Deactivated successfully.
Mar 17 18:00:17.948927 systemd[1]: var-lib-kubelet-pods-e803e84a\x2d5e61\x2d46fb\x2d91a6\x2d3411250f4200-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully.
Mar 17 18:00:17.950603 kubelet[2721]: I0317 18:00:17.950553 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m87l2\" (UniqueName: \"kubernetes.io/projected/a88959ff-624d-4e4a-bb08-2634d2121e9c-kube-api-access-m87l2\") pod \"a88959ff-624d-4e4a-bb08-2634d2121e9c\" (UID: \"a88959ff-624d-4e4a-bb08-2634d2121e9c\") "
Mar 17 18:00:17.950603 kubelet[2721]: I0317 18:00:17.950604 2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88959ff-624d-4e4a-bb08-2634d2121e9c-tigera-ca-bundle\") pod \"a88959ff-624d-4e4a-bb08-2634d2121e9c\" (UID: \"a88959ff-624d-4e4a-bb08-2634d2121e9c\") "
Mar 17 18:00:17.953786 kubelet[2721]: I0317 18:00:17.953733 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88959ff-624d-4e4a-bb08-2634d2121e9c-kube-api-access-m87l2" (OuterVolumeSpecName: "kube-api-access-m87l2") pod "a88959ff-624d-4e4a-bb08-2634d2121e9c" (UID: "a88959ff-624d-4e4a-bb08-2634d2121e9c"). InnerVolumeSpecName "kube-api-access-m87l2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 18:00:17.955620 kubelet[2721]: I0317 18:00:17.955390 2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a88959ff-624d-4e4a-bb08-2634d2121e9c-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "a88959ff-624d-4e4a-bb08-2634d2121e9c" (UID: "a88959ff-624d-4e4a-bb08-2634d2121e9c"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 18:00:17.955729 systemd[1]: var-lib-kubelet-pods-a88959ff\x2d624d\x2d4e4a\x2dbb08\x2d2634d2121e9c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dm87l2.mount: Deactivated successfully.
Mar 17 18:00:17.958359 systemd[1]: var-lib-kubelet-pods-a88959ff\x2d624d\x2d4e4a\x2dbb08\x2d2634d2121e9c-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully.
Mar 17 18:00:18.011498 sshd[6104]: Connection closed by 10.0.0.1 port 37962
Mar 17 18:00:18.011858 sshd-session[6098]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:18.016102 systemd[1]: sshd@22-10.0.0.118:22-10.0.0.1:37962.service: Deactivated successfully.
Mar 17 18:00:18.018628 systemd[1]: session-23.scope: Deactivated successfully.
Mar 17 18:00:18.019902 systemd-logind[1507]: Session 23 logged out. Waiting for processes to exit.
Mar 17 18:00:18.020977 systemd-logind[1507]: Removed session 23.
Mar 17 18:00:18.051791 kubelet[2721]: I0317 18:00:18.051742    2721 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-m87l2\" (UniqueName: \"kubernetes.io/projected/a88959ff-624d-4e4a-bb08-2634d2121e9c-kube-api-access-m87l2\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:18.051791 kubelet[2721]: I0317 18:00:18.051774    2721 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88959ff-624d-4e4a-bb08-2634d2121e9c-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:18.246087 kubelet[2721]: I0317 18:00:18.246046    2721 topology_manager.go:215] "Topology Admit Handler" podUID="92ee292f-72aa-4ba1-b670-939631ed26b2" podNamespace="calico-system" podName="calico-node-686qj"
Mar 17 18:00:18.251566 kubelet[2721]: E0317 18:00:18.251512    2721 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="e803e84a-5e61-46fb-91a6-3411250f4200" containerName="flexvol-driver"
Mar 17 18:00:18.251566 kubelet[2721]: E0317 18:00:18.251547    2721 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="e803e84a-5e61-46fb-91a6-3411250f4200" containerName="install-cni"
Mar 17 18:00:18.251566 kubelet[2721]: E0317 18:00:18.251561    2721 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" containerName="calico-kube-controllers"
Mar 17 18:00:18.251566 kubelet[2721]: E0317 18:00:18.251571    2721 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="e803e84a-5e61-46fb-91a6-3411250f4200" containerName="calico-node"
Mar 17 18:00:18.256660 kubelet[2721]: I0317 18:00:18.256623    2721 memory_manager.go:354] "RemoveStaleState removing state" podUID="e803e84a-5e61-46fb-91a6-3411250f4200" containerName="calico-node"
Mar 17 18:00:18.256660 kubelet[2721]: I0317 18:00:18.256642    2721 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" containerName="calico-kube-controllers"
Mar 17 18:00:18.268702 systemd[1]: Created slice kubepods-besteffort-pod92ee292f_72aa_4ba1_b670_939631ed26b2.slice - libcontainer container kubepods-besteffort-pod92ee292f_72aa_4ba1_b670_939631ed26b2.slice.
Mar 17 18:00:18.353263 kubelet[2721]: I0317 18:00:18.353220    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-cni-log-dir\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353382 kubelet[2721]: I0317 18:00:18.353274    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-flexvol-driver-host\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353382 kubelet[2721]: I0317 18:00:18.353298    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-policysync\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353382 kubelet[2721]: I0317 18:00:18.353322    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-xtables-lock\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353382 kubelet[2721]: I0317 18:00:18.353340    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-cni-bin-dir\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353382 kubelet[2721]: I0317 18:00:18.353359    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-cni-net-dir\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353524 kubelet[2721]: I0317 18:00:18.353381    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/92ee292f-72aa-4ba1-b670-939631ed26b2-node-certs\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353524 kubelet[2721]: I0317 18:00:18.353402    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-lib-modules\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353524 kubelet[2721]: I0317 18:00:18.353422    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92ee292f-72aa-4ba1-b670-939631ed26b2-tigera-ca-bundle\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353524 kubelet[2721]: I0317 18:00:18.353441    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-var-lib-calico\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353524 kubelet[2721]: I0317 18:00:18.353464    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/92ee292f-72aa-4ba1-b670-939631ed26b2-var-run-calico\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.353647 kubelet[2721]: I0317 18:00:18.353483    2721 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbwd7\" (UniqueName: \"kubernetes.io/projected/92ee292f-72aa-4ba1-b670-939631ed26b2-kube-api-access-cbwd7\") pod \"calico-node-686qj\" (UID: \"92ee292f-72aa-4ba1-b670-939631ed26b2\") " pod="calico-system/calico-node-686qj"
Mar 17 18:00:18.572532 kubelet[2721]: E0317 18:00:18.572390    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:18.573102 containerd[1527]: time="2025-03-17T18:00:18.573037224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-686qj,Uid:92ee292f-72aa-4ba1-b670-939631ed26b2,Namespace:calico-system,Attempt:0,}"
Mar 17 18:00:18.655005 kubelet[2721]: I0317 18:00:18.654971    2721 scope.go:117] "RemoveContainer" containerID="4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc"
Mar 17 18:00:18.656049 containerd[1527]: time="2025-03-17T18:00:18.656014064Z" level=info msg="RemoveContainer for \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\""
Mar 17 18:00:18.661497 systemd[1]: Removed slice kubepods-besteffort-poda88959ff_624d_4e4a_bb08_2634d2121e9c.slice - libcontainer container kubepods-besteffort-poda88959ff_624d_4e4a_bb08_2634d2121e9c.slice.
Mar 17 18:00:18.664260 systemd[1]: Removed slice kubepods-besteffort-pode803e84a_5e61_46fb_91a6_3411250f4200.slice - libcontainer container kubepods-besteffort-pode803e84a_5e61_46fb_91a6_3411250f4200.slice.
Mar 17 18:00:18.664359 systemd[1]: kubepods-besteffort-pode803e84a_5e61_46fb_91a6_3411250f4200.slice: Consumed 2.660s CPU time, 217.5M memory peak, 2.1M read from disk, 161M written to disk.
Mar 17 18:00:18.675543 containerd[1527]: time="2025-03-17T18:00:18.675424377Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 18:00:18.675543 containerd[1527]: time="2025-03-17T18:00:18.675493489Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 18:00:18.675543 containerd[1527]: time="2025-03-17T18:00:18.675507866Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 18:00:18.676558 containerd[1527]: time="2025-03-17T18:00:18.676481274Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 18:00:18.699850 systemd[1]: Started cri-containerd-adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3.scope - libcontainer container adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3.
Mar 17 18:00:18.706622 containerd[1527]: time="2025-03-17T18:00:18.705127497Z" level=info msg="RemoveContainer for \"4f169e1ddfcedc838c59d85cde3de121af07a052c9a685a9358db418399d00bc\" returns successfully"
Mar 17 18:00:18.706765 kubelet[2721]: I0317 18:00:18.705491    2721 scope.go:117] "RemoveContainer" containerID="d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709"
Mar 17 18:00:18.706821 containerd[1527]: time="2025-03-17T18:00:18.706793229Z" level=info msg="RemoveContainer for \"d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709\""
Mar 17 18:00:18.724509 containerd[1527]: time="2025-03-17T18:00:18.724355404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-686qj,Uid:92ee292f-72aa-4ba1-b670-939631ed26b2,Namespace:calico-system,Attempt:0,} returns sandbox id \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\""
Mar 17 18:00:18.728559 kubelet[2721]: E0317 18:00:18.728344    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:18.730500 containerd[1527]: time="2025-03-17T18:00:18.730453579Z" level=info msg="CreateContainer within sandbox \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 17 18:00:18.745714 containerd[1527]: time="2025-03-17T18:00:18.745625147Z" level=info msg="RemoveContainer for \"d2c74c3cd2e5f84ebfe033573a681761f2ddb4edcf760c40bfc722fddf6d7709\" returns successfully"
Mar 17 18:00:18.745904 kubelet[2721]: I0317 18:00:18.745866    2721 scope.go:117] "RemoveContainer" containerID="6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174"
Mar 17 18:00:18.746866 containerd[1527]: time="2025-03-17T18:00:18.746840774Z" level=info msg="RemoveContainer for \"6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174\""
Mar 17 18:00:18.886129 containerd[1527]: time="2025-03-17T18:00:18.885492943Z" level=info msg="RemoveContainer for \"6f8290ead7f1b1e2a6a40feb2f3a1803879955b1c4bcfc70ce940b7fa9be6174\" returns successfully"
Mar 17 18:00:19.082317 containerd[1527]: time="2025-03-17T18:00:19.082265066Z" level=info msg="CreateContainer within sandbox \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0\""
Mar 17 18:00:19.083600 containerd[1527]: time="2025-03-17T18:00:19.082817565Z" level=info msg="StartContainer for \"a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0\""
Mar 17 18:00:19.113735 systemd[1]: Started cri-containerd-a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0.scope - libcontainer container a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0.
Mar 17 18:00:19.181629 containerd[1527]: time="2025-03-17T18:00:19.181553845Z" level=info msg="StartContainer for \"a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0\" returns successfully"
Mar 17 18:00:19.206813 systemd[1]: cri-containerd-a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0.scope: Deactivated successfully.
Mar 17 18:00:19.207141 systemd[1]: cri-containerd-a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0.scope: Consumed 42ms CPU time, 18.1M memory peak, 9.7M read from disk, 6.3M written to disk.
Mar 17 18:00:19.228013 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0-rootfs.mount: Deactivated successfully.
Mar 17 18:00:19.255980 containerd[1527]: time="2025-03-17T18:00:19.255900552Z" level=info msg="shim disconnected" id=a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0 namespace=k8s.io
Mar 17 18:00:19.255980 containerd[1527]: time="2025-03-17T18:00:19.255955296Z" level=warning msg="cleaning up after shim disconnected" id=a42863dabba0af186fd6da29a73a1780c2d711c7f8e37a2e8f65dbc891e3aff0 namespace=k8s.io
Mar 17 18:00:19.255980 containerd[1527]: time="2025-03-17T18:00:19.255965685Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:19.288950 kubelet[2721]: I0317 18:00:19.288895    2721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a88959ff-624d-4e4a-bb08-2634d2121e9c" path="/var/lib/kubelet/pods/a88959ff-624d-4e4a-bb08-2634d2121e9c/volumes"
Mar 17 18:00:19.289681 kubelet[2721]: I0317 18:00:19.289643    2721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e803e84a-5e61-46fb-91a6-3411250f4200" path="/var/lib/kubelet/pods/e803e84a-5e61-46fb-91a6-3411250f4200/volumes"
Mar 17 18:00:19.659175 kubelet[2721]: E0317 18:00:19.658742    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:19.660839 containerd[1527]: time="2025-03-17T18:00:19.660798474Z" level=info msg="CreateContainer within sandbox \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 17 18:00:20.069287 containerd[1527]: time="2025-03-17T18:00:20.069229975Z" level=info msg="CreateContainer within sandbox \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b\""
Mar 17 18:00:20.070744 containerd[1527]: time="2025-03-17T18:00:20.070717753Z" level=info msg="StartContainer for \"42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b\""
Mar 17 18:00:20.107747 systemd[1]: Started cri-containerd-42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b.scope - libcontainer container 42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b.
Mar 17 18:00:20.315893 containerd[1527]: time="2025-03-17T18:00:20.315823982Z" level=info msg="StartContainer for \"42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b\" returns successfully"
Mar 17 18:00:20.664139 systemd[1]: cri-containerd-42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b.scope: Deactivated successfully.
Mar 17 18:00:20.665169 systemd[1]: cri-containerd-42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b.scope: Consumed 692ms CPU time, 188.5M memory peak, 170.2M read from disk.
Mar 17 18:00:20.665732 kubelet[2721]: E0317 18:00:20.665527    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:20.687255 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b-rootfs.mount: Deactivated successfully.
Mar 17 18:00:20.845220 containerd[1527]: time="2025-03-17T18:00:20.845156927Z" level=info msg="shim disconnected" id=42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b namespace=k8s.io
Mar 17 18:00:20.845220 containerd[1527]: time="2025-03-17T18:00:20.845210580Z" level=warning msg="cleaning up after shim disconnected" id=42501894796c801d537101d43aaa0822dc056b55eab25703f035fb0281f8f17b namespace=k8s.io
Mar 17 18:00:20.845220 containerd[1527]: time="2025-03-17T18:00:20.845219286Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:21.346448 systemd[1]: cri-containerd-2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282.scope: Deactivated successfully.
Mar 17 18:00:21.346868 systemd[1]: cri-containerd-2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282.scope: Consumed 362ms CPU time, 28.5M memory peak, 5.7M read from disk.
Mar 17 18:00:21.367611 containerd[1527]: time="2025-03-17T18:00:21.365479983Z" level=info msg="shim disconnected" id=2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282 namespace=k8s.io
Mar 17 18:00:21.367611 containerd[1527]: time="2025-03-17T18:00:21.365552280Z" level=warning msg="cleaning up after shim disconnected" id=2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282 namespace=k8s.io
Mar 17 18:00:21.367611 containerd[1527]: time="2025-03-17T18:00:21.365562870Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:21.368151 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282-rootfs.mount: Deactivated successfully.
Mar 17 18:00:21.387156 containerd[1527]: time="2025-03-17T18:00:21.387114602Z" level=info msg="StopContainer for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" returns successfully"
Mar 17 18:00:21.387613 containerd[1527]: time="2025-03-17T18:00:21.387572382Z" level=info msg="StopPodSandbox for \"700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0\""
Mar 17 18:00:21.387687 containerd[1527]: time="2025-03-17T18:00:21.387616666Z" level=info msg="Container to stop \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 17 18:00:21.390530 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0-shm.mount: Deactivated successfully.
Mar 17 18:00:21.395248 systemd[1]: cri-containerd-700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0.scope: Deactivated successfully.
Mar 17 18:00:21.413601 containerd[1527]: time="2025-03-17T18:00:21.413526275Z" level=info msg="shim disconnected" id=700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0 namespace=k8s.io
Mar 17 18:00:21.413805 containerd[1527]: time="2025-03-17T18:00:21.413779426Z" level=warning msg="cleaning up after shim disconnected" id=700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0 namespace=k8s.io
Mar 17 18:00:21.413805 containerd[1527]: time="2025-03-17T18:00:21.413800236Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 17 18:00:21.417191 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0-rootfs.mount: Deactivated successfully.
Mar 17 18:00:21.435235 containerd[1527]: time="2025-03-17T18:00:21.435184900Z" level=info msg="TearDown network for sandbox \"700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0\" successfully"
Mar 17 18:00:21.435235 containerd[1527]: time="2025-03-17T18:00:21.435217271Z" level=info msg="StopPodSandbox for \"700d37f1bd3702fad52d726d7a7351a8b5dad2119cc90181c94855eb2e3a3bc0\" returns successfully"
Mar 17 18:00:21.475116 kubelet[2721]: I0317 18:00:21.475081    2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d254ee3-70bb-43a2-a894-a8d9916f5164-tigera-ca-bundle\") pod \"0d254ee3-70bb-43a2-a894-a8d9916f5164\" (UID: \"0d254ee3-70bb-43a2-a894-a8d9916f5164\") "
Mar 17 18:00:21.475315 kubelet[2721]: I0317 18:00:21.475132    2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0d254ee3-70bb-43a2-a894-a8d9916f5164-typha-certs\") pod \"0d254ee3-70bb-43a2-a894-a8d9916f5164\" (UID: \"0d254ee3-70bb-43a2-a894-a8d9916f5164\") "
Mar 17 18:00:21.475315 kubelet[2721]: I0317 18:00:21.475156    2721 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tclpd\" (UniqueName: \"kubernetes.io/projected/0d254ee3-70bb-43a2-a894-a8d9916f5164-kube-api-access-tclpd\") pod \"0d254ee3-70bb-43a2-a894-a8d9916f5164\" (UID: \"0d254ee3-70bb-43a2-a894-a8d9916f5164\") "
Mar 17 18:00:21.478239 kubelet[2721]: I0317 18:00:21.477947    2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d254ee3-70bb-43a2-a894-a8d9916f5164-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "0d254ee3-70bb-43a2-a894-a8d9916f5164" (UID: "0d254ee3-70bb-43a2-a894-a8d9916f5164"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 17 18:00:21.481612 kubelet[2721]: I0317 18:00:21.479694    2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d254ee3-70bb-43a2-a894-a8d9916f5164-kube-api-access-tclpd" (OuterVolumeSpecName: "kube-api-access-tclpd") pod "0d254ee3-70bb-43a2-a894-a8d9916f5164" (UID: "0d254ee3-70bb-43a2-a894-a8d9916f5164"). InnerVolumeSpecName "kube-api-access-tclpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 17 18:00:21.481612 kubelet[2721]: I0317 18:00:21.480250    2721 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d254ee3-70bb-43a2-a894-a8d9916f5164-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "0d254ee3-70bb-43a2-a894-a8d9916f5164" (UID: "0d254ee3-70bb-43a2-a894-a8d9916f5164"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 17 18:00:21.480220 systemd[1]: var-lib-kubelet-pods-0d254ee3\x2d70bb\x2d43a2\x2da894\x2da8d9916f5164-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtclpd.mount: Deactivated successfully.
Mar 17 18:00:21.480337 systemd[1]: var-lib-kubelet-pods-0d254ee3\x2d70bb\x2d43a2\x2da894\x2da8d9916f5164-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully.
Mar 17 18:00:21.483787 systemd[1]: var-lib-kubelet-pods-0d254ee3\x2d70bb\x2d43a2\x2da894\x2da8d9916f5164-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully.
Mar 17 18:00:21.576349 kubelet[2721]: I0317 18:00:21.576297    2721 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d254ee3-70bb-43a2-a894-a8d9916f5164-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:21.576349 kubelet[2721]: I0317 18:00:21.576326    2721 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0d254ee3-70bb-43a2-a894-a8d9916f5164-typha-certs\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:21.576349 kubelet[2721]: I0317 18:00:21.576334    2721 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-tclpd\" (UniqueName: \"kubernetes.io/projected/0d254ee3-70bb-43a2-a894-a8d9916f5164-kube-api-access-tclpd\") on node \"localhost\" DevicePath \"\""
Mar 17 18:00:21.667899 kubelet[2721]: E0317 18:00:21.667437    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:21.673143 kubelet[2721]: I0317 18:00:21.673115    2721 scope.go:117] "RemoveContainer" containerID="2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282"
Mar 17 18:00:21.678943 systemd[1]: Removed slice kubepods-besteffort-pod0d254ee3_70bb_43a2_a894_a8d9916f5164.slice - libcontainer container kubepods-besteffort-pod0d254ee3_70bb_43a2_a894_a8d9916f5164.slice.
Mar 17 18:00:21.679367 systemd[1]: kubepods-besteffort-pod0d254ee3_70bb_43a2_a894_a8d9916f5164.slice: Consumed 389ms CPU time, 28.8M memory peak, 5.7M read from disk.
Mar 17 18:00:21.687595 containerd[1527]: time="2025-03-17T18:00:21.686898885Z" level=info msg="RemoveContainer for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\""
Mar 17 18:00:21.687595 containerd[1527]: time="2025-03-17T18:00:21.686935365Z" level=info msg="CreateContainer within sandbox \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 17 18:00:21.691648 containerd[1527]: time="2025-03-17T18:00:21.691616465Z" level=info msg="RemoveContainer for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" returns successfully"
Mar 17 18:00:21.692737 kubelet[2721]: I0317 18:00:21.692712    2721 scope.go:117] "RemoveContainer" containerID="2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282"
Mar 17 18:00:21.693049 containerd[1527]: time="2025-03-17T18:00:21.692916027Z" level=error msg="ContainerStatus for \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\": not found"
Mar 17 18:00:21.693685 kubelet[2721]: E0317 18:00:21.693199    2721 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\": not found" containerID="2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282"
Mar 17 18:00:21.693685 kubelet[2721]: I0317 18:00:21.693258    2721 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282"} err="failed to get container status \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\": rpc error: code = NotFound desc = an error occurred when try to find container \"2aaff2b0d17ec12c8210096f21c7b73c20c290873a8f5610c64d2aed559ac282\": not found"
Mar 17 18:00:21.704288 containerd[1527]: time="2025-03-17T18:00:21.704235391Z" level=info msg="CreateContainer within sandbox \"adfc760eaecaadec502a0c3ca3eb939839031bb27c9ac996c354d3f748aba5c3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f431704f6be62d1a18ff9aadb3ccc15447c3b245886b191d31f2d552d8bb7c25\""
Mar 17 18:00:21.704842 containerd[1527]: time="2025-03-17T18:00:21.704813490Z" level=info msg="StartContainer for \"f431704f6be62d1a18ff9aadb3ccc15447c3b245886b191d31f2d552d8bb7c25\""
Mar 17 18:00:21.739744 systemd[1]: Started cri-containerd-f431704f6be62d1a18ff9aadb3ccc15447c3b245886b191d31f2d552d8bb7c25.scope - libcontainer container f431704f6be62d1a18ff9aadb3ccc15447c3b245886b191d31f2d552d8bb7c25.
Mar 17 18:00:21.772338 containerd[1527]: time="2025-03-17T18:00:21.772287105Z" level=info msg="StartContainer for \"f431704f6be62d1a18ff9aadb3ccc15447c3b245886b191d31f2d552d8bb7c25\" returns successfully"
Mar 17 18:00:22.678032 kubelet[2721]: E0317 18:00:22.678002    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:23.040735 systemd[1]: Started sshd@23-10.0.0.118:22-10.0.0.1:37976.service - OpenSSH per-connection server daemon (10.0.0.1:37976).
Mar 17 18:00:23.088154 sshd[6534]: Accepted publickey for core from 10.0.0.1 port 37976 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:23.089947 sshd-session[6534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:23.094640 systemd-logind[1507]: New session 24 of user core.
Mar 17 18:00:23.103722 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 17 18:00:23.217645 sshd[6539]: Connection closed by 10.0.0.1 port 37976
Mar 17 18:00:23.217993 sshd-session[6534]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:23.222360 systemd[1]: sshd@23-10.0.0.118:22-10.0.0.1:37976.service: Deactivated successfully.
Mar 17 18:00:23.224836 systemd[1]: session-24.scope: Deactivated successfully.
Mar 17 18:00:23.225489 systemd-logind[1507]: Session 24 logged out. Waiting for processes to exit.
Mar 17 18:00:23.226305 systemd-logind[1507]: Removed session 24.
Mar 17 18:00:23.288272 kubelet[2721]: I0317 18:00:23.288231    2721 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d254ee3-70bb-43a2-a894-a8d9916f5164" path="/var/lib/kubelet/pods/0d254ee3-70bb-43a2-a894-a8d9916f5164/volumes"
Mar 17 18:00:23.679722 kubelet[2721]: E0317 18:00:23.679673    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:28.234387 systemd[1]: Started sshd@24-10.0.0.118:22-10.0.0.1:52968.service - OpenSSH per-connection server daemon (10.0.0.1:52968).
Mar 17 18:00:28.281182 sshd[6761]: Accepted publickey for core from 10.0.0.1 port 52968 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:28.283099 sshd-session[6761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:28.287706 systemd-logind[1507]: New session 25 of user core.
Mar 17 18:00:28.297762 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 17 18:00:28.409799 sshd[6763]: Connection closed by 10.0.0.1 port 52968
Mar 17 18:00:28.410245 sshd-session[6761]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:28.413920 systemd[1]: sshd@24-10.0.0.118:22-10.0.0.1:52968.service: Deactivated successfully.
Mar 17 18:00:28.416330 systemd[1]: session-25.scope: Deactivated successfully.
Mar 17 18:00:28.417147 systemd-logind[1507]: Session 25 logged out. Waiting for processes to exit.
Mar 17 18:00:28.418062 systemd-logind[1507]: Removed session 25.
Mar 17 18:00:29.288258 kubelet[2721]: E0317 18:00:29.288211    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:33.425193 systemd[1]: Started sshd@25-10.0.0.118:22-10.0.0.1:52974.service - OpenSSH per-connection server daemon (10.0.0.1:52974).
Mar 17 18:00:33.486690 sshd[6784]: Accepted publickey for core from 10.0.0.1 port 52974 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:33.488634 sshd-session[6784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:33.494024 systemd-logind[1507]: New session 26 of user core.
Mar 17 18:00:33.507799 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 17 18:00:33.631259 sshd[6786]: Connection closed by 10.0.0.1 port 52974
Mar 17 18:00:33.631732 sshd-session[6784]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:33.635898 systemd[1]: sshd@25-10.0.0.118:22-10.0.0.1:52974.service: Deactivated successfully.
Mar 17 18:00:33.639041 systemd[1]: session-26.scope: Deactivated successfully.
Mar 17 18:00:33.640075 systemd-logind[1507]: Session 26 logged out. Waiting for processes to exit.
Mar 17 18:00:33.641286 systemd-logind[1507]: Removed session 26.
Mar 17 18:00:35.286900 kubelet[2721]: E0317 18:00:35.286854    2721 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
Mar 17 18:00:38.674892 systemd[1]: Started sshd@26-10.0.0.118:22-10.0.0.1:42366.service - OpenSSH per-connection server daemon (10.0.0.1:42366).
Mar 17 18:00:38.728559 sshd[6807]: Accepted publickey for core from 10.0.0.1 port 42366 ssh2: RSA SHA256:fvq/EnOzAjyVAI7Ny/Y8iSI7Zce+5eYVas+A6dENwjM
Mar 17 18:00:38.731348 sshd-session[6807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 18:00:38.744268 systemd-logind[1507]: New session 27 of user core.
Mar 17 18:00:38.752365 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 17 18:00:38.904695 sshd[6809]: Connection closed by 10.0.0.1 port 42366
Mar 17 18:00:38.906180 sshd-session[6807]: pam_unix(sshd:session): session closed for user core
Mar 17 18:00:38.911593 systemd[1]: sshd@26-10.0.0.118:22-10.0.0.1:42366.service: Deactivated successfully.
Mar 17 18:00:38.914300 systemd[1]: session-27.scope: Deactivated successfully.
Mar 17 18:00:38.916098 systemd-logind[1507]: Session 27 logged out. Waiting for processes to exit.
Mar 17 18:00:38.917307 systemd-logind[1507]: Removed session 27.