Mar 17 17:29:24.959701 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 17 17:29:24.959721 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT Mon Mar 17 16:05:23 -00 2025 Mar 17 17:29:24.959730 kernel: KASLR enabled Mar 17 17:29:24.959736 kernel: efi: EFI v2.7 by EDK II Mar 17 17:29:24.959742 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdbbbf018 ACPI 2.0=0xd9b43018 RNG=0xd9b43a18 MEMRESERVE=0xd9b40d98 Mar 17 17:29:24.959747 kernel: random: crng init done Mar 17 17:29:24.959754 kernel: secureboot: Secure boot disabled Mar 17 17:29:24.959760 kernel: ACPI: Early table checksum verification disabled Mar 17 17:29:24.959766 kernel: ACPI: RSDP 0x00000000D9B43018 000024 (v02 BOCHS ) Mar 17 17:29:24.959773 kernel: ACPI: XSDT 0x00000000D9B43F18 000064 (v01 BOCHS BXPC 00000001 01000013) Mar 17 17:29:24.959779 kernel: ACPI: FACP 0x00000000D9B43B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959785 kernel: ACPI: DSDT 0x00000000D9B41018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959790 kernel: ACPI: APIC 0x00000000D9B43C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959796 kernel: ACPI: PPTT 0x00000000D9B43098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959803 kernel: ACPI: GTDT 0x00000000D9B43818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959811 kernel: ACPI: MCFG 0x00000000D9B43A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959817 kernel: ACPI: SPCR 0x00000000D9B43918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959823 kernel: ACPI: DBG2 0x00000000D9B43998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959829 kernel: ACPI: IORT 0x00000000D9B43198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 17:29:24.959835 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Mar 17 17:29:24.959842 kernel: NUMA: Failed to initialise from firmware Mar 17 17:29:24.959848 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Mar 17 17:29:24.959854 kernel: NUMA: NODE_DATA [mem 0xdc958800-0xdc95dfff] Mar 17 17:29:24.959860 kernel: Zone ranges: Mar 17 17:29:24.959919 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Mar 17 17:29:24.959928 kernel: DMA32 empty Mar 17 17:29:24.959934 kernel: Normal empty Mar 17 17:29:24.959941 kernel: Movable zone start for each node Mar 17 17:29:24.959947 kernel: Early memory node ranges Mar 17 17:29:24.959953 kernel: node 0: [mem 0x0000000040000000-0x00000000d976ffff] Mar 17 17:29:24.959959 kernel: node 0: [mem 0x00000000d9770000-0x00000000d9b3ffff] Mar 17 17:29:24.959965 kernel: node 0: [mem 0x00000000d9b40000-0x00000000dce1ffff] Mar 17 17:29:24.959971 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff] Mar 17 17:29:24.959977 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff] Mar 17 17:29:24.959983 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff] Mar 17 17:29:24.959989 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Mar 17 17:29:24.959995 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Mar 17 17:29:24.960003 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Mar 17 17:29:24.960009 kernel: psci: probing for conduit method from ACPI. Mar 17 17:29:24.960015 kernel: psci: PSCIv1.1 detected in firmware. 
Mar 17 17:29:24.960024 kernel: psci: Using standard PSCI v0.2 function IDs Mar 17 17:29:24.960031 kernel: psci: Trusted OS migration not required Mar 17 17:29:24.960038 kernel: psci: SMC Calling Convention v1.1 Mar 17 17:29:24.960046 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Mar 17 17:29:24.960053 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 17 17:29:24.960059 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 17 17:29:24.960066 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Mar 17 17:29:24.960072 kernel: Detected PIPT I-cache on CPU0 Mar 17 17:29:24.960079 kernel: CPU features: detected: GIC system register CPU interface Mar 17 17:29:24.960086 kernel: CPU features: detected: Hardware dirty bit management Mar 17 17:29:24.960092 kernel: CPU features: detected: Spectre-v4 Mar 17 17:29:24.960099 kernel: CPU features: detected: Spectre-BHB Mar 17 17:29:24.960105 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 17 17:29:24.960113 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 17 17:29:24.960119 kernel: CPU features: detected: ARM erratum 1418040 Mar 17 17:29:24.960126 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 17 17:29:24.960132 kernel: alternatives: applying boot alternatives Mar 17 17:29:24.960140 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:29:24.960147 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 17:29:24.960153 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 17:29:24.960160 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 17:29:24.960166 kernel: Fallback order for Node 0: 0 Mar 17 17:29:24.960173 kernel: Built 1 zonelists, mobility grouping on. Total pages: 633024 Mar 17 17:29:24.960180 kernel: Policy zone: DMA Mar 17 17:29:24.960187 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 17:29:24.960194 kernel: software IO TLB: area num 4. Mar 17 17:29:24.960200 kernel: software IO TLB: mapped [mem 0x00000000d2e00000-0x00000000d6e00000] (64MB) Mar 17 17:29:24.960207 kernel: Memory: 2386260K/2572288K available (10240K kernel code, 2186K rwdata, 8100K rodata, 39744K init, 897K bss, 186028K reserved, 0K cma-reserved) Mar 17 17:29:24.960214 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Mar 17 17:29:24.960221 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 17:29:24.960228 kernel: rcu: RCU event tracing is enabled. Mar 17 17:29:24.960234 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Mar 17 17:29:24.960241 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 17:29:24.960248 kernel: Tracing variant of Tasks RCU enabled. Mar 17 17:29:24.960261 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 17 17:29:24.960268 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Mar 17 17:29:24.960277 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 17 17:29:24.960283 kernel: GICv3: 256 SPIs implemented Mar 17 17:29:24.960290 kernel: GICv3: 0 Extended SPIs implemented Mar 17 17:29:24.960296 kernel: Root IRQ handler: gic_handle_irq Mar 17 17:29:24.960303 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 17 17:29:24.960313 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Mar 17 17:29:24.960321 kernel: ITS [mem 0x08080000-0x0809ffff] Mar 17 17:29:24.960330 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400c0000 (indirect, esz 8, psz 64K, shr 1) Mar 17 17:29:24.960338 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400d0000 (flat, esz 8, psz 64K, shr 1) Mar 17 17:29:24.960349 kernel: GICv3: using LPI property table @0x00000000400f0000 Mar 17 17:29:24.960357 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040100000 Mar 17 17:29:24.960368 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 17 17:29:24.960376 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:29:24.960385 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 17 17:29:24.960392 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 17 17:29:24.960401 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 17 17:29:24.960410 kernel: arm-pv: using stolen time PV Mar 17 17:29:24.960419 kernel: Console: colour dummy device 80x25 Mar 17 17:29:24.960427 kernel: ACPI: Core revision 20230628 Mar 17 17:29:24.960435 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 17 17:29:24.960442 kernel: pid_max: default: 32768 minimum: 301 Mar 17 17:29:24.960450 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 17 17:29:24.960456 kernel: landlock: Up and running. Mar 17 17:29:24.960463 kernel: SELinux: Initializing. Mar 17 17:29:24.960470 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:29:24.960477 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 17:29:24.960483 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 17 17:29:24.960490 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 17 17:29:24.960497 kernel: rcu: Hierarchical SRCU implementation. Mar 17 17:29:24.960504 kernel: rcu: Max phase no-delay instances is 400. Mar 17 17:29:24.960512 kernel: Platform MSI: ITS@0x8080000 domain created Mar 17 17:29:24.960519 kernel: PCI/MSI: ITS@0x8080000 domain created Mar 17 17:29:24.960525 kernel: Remapping and enabling EFI services. Mar 17 17:29:24.960532 kernel: smp: Bringing up secondary CPUs ... 
Mar 17 17:29:24.960539 kernel: Detected PIPT I-cache on CPU1 Mar 17 17:29:24.960545 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Mar 17 17:29:24.960552 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040110000 Mar 17 17:29:24.960559 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:29:24.960566 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 17 17:29:24.960573 kernel: Detected PIPT I-cache on CPU2 Mar 17 17:29:24.960581 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Mar 17 17:29:24.960588 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040120000 Mar 17 17:29:24.960599 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:29:24.960607 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Mar 17 17:29:24.960615 kernel: Detected PIPT I-cache on CPU3 Mar 17 17:29:24.960622 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Mar 17 17:29:24.960629 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040130000 Mar 17 17:29:24.960636 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 17 17:29:24.960643 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Mar 17 17:29:24.960651 kernel: smp: Brought up 1 node, 4 CPUs Mar 17 17:29:24.960658 kernel: SMP: Total of 4 processors activated. Mar 17 17:29:24.960665 kernel: CPU features: detected: 32-bit EL0 Support Mar 17 17:29:24.960672 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 17 17:29:24.960680 kernel: CPU features: detected: Common not Private translations Mar 17 17:29:24.960687 kernel: CPU features: detected: CRC32 instructions Mar 17 17:29:24.960694 kernel: CPU features: detected: Enhanced Virtualization Traps Mar 17 17:29:24.960701 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 17 17:29:24.960709 kernel: CPU features: detected: LSE atomic instructions Mar 17 17:29:24.960716 kernel: CPU features: detected: Privileged Access Never Mar 17 17:29:24.960724 kernel: CPU features: detected: RAS Extension Support Mar 17 17:29:24.960731 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Mar 17 17:29:24.960738 kernel: CPU: All CPU(s) started at EL1 Mar 17 17:29:24.960745 kernel: alternatives: applying system-wide alternatives Mar 17 17:29:24.960752 kernel: devtmpfs: initialized Mar 17 17:29:24.960759 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 17:29:24.960766 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Mar 17 17:29:24.960775 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 17:29:24.960782 kernel: SMBIOS 3.0.0 present. 
Mar 17 17:29:24.960789 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Mar 17 17:29:24.960796 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 17 17:29:24.960803 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 17 17:29:24.960810 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 17 17:29:24.960818 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 17 17:29:24.960825 kernel: audit: initializing netlink subsys (disabled)
Mar 17 17:29:24.960832 kernel: audit: type=2000 audit(0.019:1): state=initialized audit_enabled=0 res=1
Mar 17 17:29:24.960840 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 17 17:29:24.960847 kernel: cpuidle: using governor menu
Mar 17 17:29:24.960855 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 17 17:29:24.960862 kernel: ASID allocator initialised with 32768 entries
Mar 17 17:29:24.960876 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 17 17:29:24.960883 kernel: Serial: AMBA PL011 UART driver
Mar 17 17:29:24.960890 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 17 17:29:24.960898 kernel: Modules: 0 pages in range for non-PLT usage
Mar 17 17:29:24.960905 kernel: Modules: 508944 pages in range for PLT usage
Mar 17 17:29:24.960914 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 17 17:29:24.960921 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 17 17:29:24.960928 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 17 17:29:24.960936 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 17 17:29:24.960943 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 17 17:29:24.960950 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 17 17:29:24.960957 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 17 17:29:24.960964 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 17 17:29:24.960972 kernel: ACPI: Added _OSI(Module Device)
Mar 17 17:29:24.960980 kernel: ACPI: Added _OSI(Processor Device)
Mar 17 17:29:24.960988 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 17 17:29:24.960995 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 17 17:29:24.961002 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 17 17:29:24.961009 kernel: ACPI: Interpreter enabled
Mar 17 17:29:24.961017 kernel: ACPI: Using GIC for interrupt routing
Mar 17 17:29:24.961024 kernel: ACPI: MCFG table detected, 1 entries
Mar 17 17:29:24.961031 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 17 17:29:24.961038 kernel: printk: console [ttyAMA0] enabled
Mar 17 17:29:24.961047 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 17 17:29:24.961182 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 17 17:29:24.961265 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 17 17:29:24.961338 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 17 17:29:24.961404 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 17 17:29:24.961468 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 17 17:29:24.961478 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 17 17:29:24.961489 kernel: PCI host bridge to bus 0000:00
Mar 17 17:29:24.961561 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 17 17:29:24.961621 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 17 17:29:24.961681 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 17 17:29:24.961739 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 17 17:29:24.961822 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Mar 17 17:29:24.961917 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00
Mar 17 17:29:24.961994 kernel: pci 0000:00:01.0: reg 0x10: [io 0x0000-0x001f]
Mar 17 17:29:24.962064 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x10000000-0x10000fff]
Mar 17 17:29:24.962134 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 17 17:29:24.962201 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 17 17:29:24.962276 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x10000000-0x10000fff]
Mar 17 17:29:24.962347 kernel: pci 0000:00:01.0: BAR 0: assigned [io 0x1000-0x101f]
Mar 17 17:29:24.962409 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Mar 17 17:29:24.962471 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 17 17:29:24.962531 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Mar 17 17:29:24.962540 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Mar 17 17:29:24.962548 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Mar 17 17:29:24.962555 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Mar 17 17:29:24.962563 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Mar 17 17:29:24.962570 kernel: iommu: Default domain type: Translated
Mar 17 17:29:24.962578 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 17 17:29:24.962588 kernel: efivars: Registered efivars operations
Mar 17 17:29:24.962595 kernel: vgaarb: loaded
Mar 17 17:29:24.962602 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 17 17:29:24.962610 kernel: VFS: Disk quotas dquot_6.6.0
Mar 17 17:29:24.962617 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 17 17:29:24.962625 kernel: pnp: PnP ACPI init
Mar 17 17:29:24.962700 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Mar 17 17:29:24.962710 kernel: pnp: PnP ACPI: found 1 devices
Mar 17 17:29:24.962719 kernel: NET: Registered PF_INET protocol family
Mar 17 17:29:24.962727 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 17 17:29:24.962735 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 17 17:29:24.962743 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 17 17:29:24.962750 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 17 17:29:24.962758 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 17 17:29:24.962765 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 17 17:29:24.962772 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:29:24.962780 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:29:24.962789 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 17 17:29:24.962796 kernel: PCI: CLS 0 bytes, default 64
Mar 17 17:29:24.962803 kernel: kvm [1]: HYP mode not available
Mar 17 17:29:24.962811 kernel: Initialise system trusted keyrings Mar 17 17:29:24.962818 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 17:29:24.962825 kernel: Key type asymmetric registered Mar 17 17:29:24.962833 kernel: Asymmetric key parser 'x509' registered Mar 17 17:29:24.962840 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 17 17:29:24.962848 kernel: io scheduler mq-deadline registered Mar 17 17:29:24.962857 kernel: io scheduler kyber registered Mar 17 17:29:24.962934 kernel: io scheduler bfq registered Mar 17 17:29:24.962944 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 17 17:29:24.962951 kernel: ACPI: button: Power Button [PWRB] Mar 17 17:29:24.962959 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 17 17:29:24.963038 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Mar 17 17:29:24.963048 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 17:29:24.963055 kernel: thunder_xcv, ver 1.0 Mar 17 17:29:24.963063 kernel: thunder_bgx, ver 1.0 Mar 17 17:29:24.963073 kernel: nicpf, ver 1.0 Mar 17 17:29:24.963081 kernel: nicvf, ver 1.0 Mar 17 17:29:24.963155 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 17 17:29:24.963218 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:29:24 UTC (1742232564) Mar 17 17:29:24.963227 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 17:29:24.963235 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Mar 17 17:29:24.963242 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 17 17:29:24.963249 kernel: watchdog: Hard watchdog permanently disabled Mar 17 17:29:24.963267 kernel: NET: Registered PF_INET6 protocol family Mar 17 17:29:24.963274 kernel: Segment Routing with IPv6 Mar 17 17:29:24.963281 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 17:29:24.963289 kernel: NET: Registered PF_PACKET protocol family Mar 17 17:29:24.963296 kernel: Key type dns_resolver registered Mar 17 17:29:24.963303 kernel: registered taskstats version 1 Mar 17 17:29:24.963311 kernel: Loading compiled-in X.509 certificates Mar 17 17:29:24.963318 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 74c9b4f5dfad711856d7363c976664fc02c1e24c' Mar 17 17:29:24.963325 kernel: Key type .fscrypt registered Mar 17 17:29:24.963334 kernel: Key type fscrypt-provisioning registered Mar 17 17:29:24.963342 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 17:29:24.963349 kernel: ima: Allocated hash algorithm: sha1 Mar 17 17:29:24.963356 kernel: ima: No architecture policies found Mar 17 17:29:24.963363 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 17 17:29:24.963371 kernel: clk: Disabling unused clocks Mar 17 17:29:24.963378 kernel: Freeing unused kernel memory: 39744K Mar 17 17:29:24.963385 kernel: Run /init as init process Mar 17 17:29:24.963393 kernel: with arguments: Mar 17 17:29:24.963401 kernel: /init Mar 17 17:29:24.963408 kernel: with environment: Mar 17 17:29:24.963416 kernel: HOME=/ Mar 17 17:29:24.963423 kernel: TERM=linux Mar 17 17:29:24.963430 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 17:29:24.963439 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:29:24.963448 systemd[1]: Detected virtualization kvm. Mar 17 17:29:24.963456 systemd[1]: Detected architecture arm64. Mar 17 17:29:24.963465 systemd[1]: Running in initrd. Mar 17 17:29:24.963473 systemd[1]: No hostname configured, using default hostname. Mar 17 17:29:24.963480 systemd[1]: Hostname set to . Mar 17 17:29:24.963488 systemd[1]: Initializing machine ID from VM UUID. Mar 17 17:29:24.963496 systemd[1]: Queued start job for default target initrd.target. Mar 17 17:29:24.963504 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:29:24.963512 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:29:24.963520 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 17 17:29:24.963530 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 17 17:29:24.963537 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 17 17:29:24.963546 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 17 17:29:24.963555 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 17 17:29:24.963563 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 17 17:29:24.963571 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:29:24.963579 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:29:24.963589 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:29:24.963597 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:29:24.963604 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:29:24.963612 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:29:24.963620 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:29:24.963628 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:29:24.963636 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:29:24.963646 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:29:24.963662 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Mar 17 17:29:24.963670 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:29:24.963678 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:29:24.963686 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:29:24.963695 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 17 17:29:24.963703 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:29:24.963711 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 17 17:29:24.963719 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 17:29:24.963727 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 17 17:29:24.963736 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:29:24.963744 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:29:24.963752 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 17 17:29:24.963760 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:29:24.963768 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 17:29:24.963777 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:29:24.963786 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:29:24.963812 systemd-journald[238]: Collecting audit messages is disabled. Mar 17 17:29:24.963833 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:29:24.963841 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:29:24.963850 systemd-journald[238]: Journal started Mar 17 17:29:24.963881 systemd-journald[238]: Runtime Journal (/run/log/journal/53a8b9d5b9df4474855c8c9be5266e97) is 5.9M, max 47.3M, 41.4M free. Mar 17 17:29:24.956484 systemd-modules-load[240]: Inserted module 'overlay' Mar 17 17:29:24.969523 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:29:24.969570 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 17:29:24.972133 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:29:24.972178 kernel: Bridge firewalling registered Mar 17 17:29:24.972111 systemd-modules-load[240]: Inserted module 'br_netfilter' Mar 17 17:29:24.973378 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:29:24.977128 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:29:24.979388 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:29:24.981634 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:29:24.984180 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:29:24.987729 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 17 17:29:24.988645 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:29:24.990221 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:29:24.993531 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Mar 17 17:29:25.002184 dracut-cmdline[273]: dracut-dracut-053 Mar 17 17:29:25.004979 dracut-cmdline[273]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=31b104f73129b84fa679201ebe02fbfd197d071bbf0576d6ccc5c5442bcbb405 Mar 17 17:29:25.025687 systemd-resolved[276]: Positive Trust Anchors: Mar 17 17:29:25.025774 systemd-resolved[276]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:29:25.025807 systemd-resolved[276]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:29:25.031714 systemd-resolved[276]: Defaulting to hostname 'linux'. Mar 17 17:29:25.033022 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:29:25.033951 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 17 17:29:25.081879 kernel: SCSI subsystem initialized Mar 17 17:29:25.085888 kernel: Loading iSCSI transport class v2.0-870. Mar 17 17:29:25.092933 kernel: iscsi: registered transport (tcp) Mar 17 17:29:25.105883 kernel: iscsi: registered transport (qla4xxx) Mar 17 17:29:25.105915 kernel: QLogic iSCSI HBA Driver Mar 17 17:29:25.149633 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 17 17:29:25.160039 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 17 17:29:25.177716 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 17 17:29:25.177783 kernel: device-mapper: uevent: version 1.0.3 Mar 17 17:29:25.177808 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 17 17:29:25.226896 kernel: raid6: neonx8 gen() 15766 MB/s Mar 17 17:29:25.243889 kernel: raid6: neonx4 gen() 15638 MB/s Mar 17 17:29:25.260884 kernel: raid6: neonx2 gen() 13183 MB/s Mar 17 17:29:25.277883 kernel: raid6: neonx1 gen() 10489 MB/s Mar 17 17:29:25.294882 kernel: raid6: int64x8 gen() 6963 MB/s Mar 17 17:29:25.311881 kernel: raid6: int64x4 gen() 7346 MB/s Mar 17 17:29:25.328881 kernel: raid6: int64x2 gen() 6127 MB/s Mar 17 17:29:25.345888 kernel: raid6: int64x1 gen() 5049 MB/s Mar 17 17:29:25.345903 kernel: raid6: using algorithm neonx8 gen() 15766 MB/s Mar 17 17:29:25.362885 kernel: raid6: .... xor() 11920 MB/s, rmw enabled Mar 17 17:29:25.362899 kernel: raid6: using neon recovery algorithm Mar 17 17:29:25.368114 kernel: xor: measuring software checksum speed Mar 17 17:29:25.368132 kernel: 8regs : 19797 MB/sec Mar 17 17:29:25.369221 kernel: 32regs : 19272 MB/sec Mar 17 17:29:25.369238 kernel: arm64_neon : 26945 MB/sec Mar 17 17:29:25.369260 kernel: xor: using function: arm64_neon (26945 MB/sec) Mar 17 17:29:25.420892 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 17 17:29:25.432360 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Mar 17 17:29:25.440041 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:29:25.452062 systemd-udevd[460]: Using default interface naming scheme 'v255'. Mar 17 17:29:25.455351 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:29:25.471116 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 17 17:29:25.482535 dracut-pre-trigger[467]: rd.md=0: removing MD RAID activation Mar 17 17:29:25.510161 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 17:29:25.522013 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:29:25.562917 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:29:25.573074 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 17 17:29:25.587040 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 17 17:29:25.588639 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 17:29:25.590187 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:29:25.592372 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:29:25.601089 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 17 17:29:25.617332 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Mar 17 17:29:25.623606 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Mar 17 17:29:25.623712 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 17 17:29:25.623731 kernel: GPT:9289727 != 19775487 Mar 17 17:29:25.623740 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 17 17:29:25.623749 kernel: GPT:9289727 != 19775487 Mar 17 17:29:25.623759 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 17 17:29:25.623768 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 17:29:25.617631 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:29:25.624288 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:29:25.624398 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:29:25.626966 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:29:25.629325 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:29:25.629472 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:29:25.631364 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:29:25.638067 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:29:25.648306 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:29:25.654089 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (520) Mar 17 17:29:25.654111 kernel: BTRFS: device fsid c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 devid 1 transid 40 /dev/vda3 scanned by (udev-worker) (509) Mar 17 17:29:25.657101 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 17 17:29:25.664254 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 17 17:29:25.671245 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Mar 17 17:29:25.675038 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 17 17:29:25.676202 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 17 17:29:25.695022 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 17 17:29:25.696919 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 17 17:29:25.702853 disk-uuid[551]: Primary Header is updated. Mar 17 17:29:25.702853 disk-uuid[551]: Secondary Entries is updated. Mar 17 17:29:25.702853 disk-uuid[551]: Secondary Header is updated. Mar 17 17:29:25.709898 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 17:29:25.719610 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:29:26.719630 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 17:29:26.719706 disk-uuid[552]: The operation has completed successfully. Mar 17 17:29:26.747735 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 17:29:26.747831 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 17 17:29:26.762040 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 17 17:29:26.766129 sh[571]: Success Mar 17 17:29:26.780878 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 17 17:29:26.813622 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 17 17:29:26.830284 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 17 17:29:26.831791 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 17 17:29:26.842055 kernel: BTRFS info (device dm-0): first mount of filesystem c0c482e3-6885-4a4e-b31c-6bc8f8c403e7 Mar 17 17:29:26.842108 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:29:26.842128 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 17 17:29:26.843383 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 17 17:29:26.843398 kernel: BTRFS info (device dm-0): using free space tree Mar 17 17:29:26.847360 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 17 17:29:26.848631 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 17 17:29:26.858018 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 17 17:29:26.859601 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 17 17:29:26.867478 kernel: BTRFS info (device vda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:29:26.867519 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:29:26.867531 kernel: BTRFS info (device vda6): using free space tree Mar 17 17:29:26.869920 kernel: BTRFS info (device vda6): auto enabling async discard Mar 17 17:29:26.877100 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 17:29:26.878766 kernel: BTRFS info (device vda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:29:26.884186 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 17 17:29:26.889028 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Mar 17 17:29:26.958106 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:29:26.971222 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 17:29:26.980637 ignition[656]: Ignition 2.20.0 Mar 17 17:29:26.980647 ignition[656]: Stage: fetch-offline Mar 17 17:29:26.980681 ignition[656]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:29:26.980689 ignition[656]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:29:26.980838 ignition[656]: parsed url from cmdline: "" Mar 17 17:29:26.980841 ignition[656]: no config URL provided Mar 17 17:29:26.980845 ignition[656]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 17:29:26.980852 ignition[656]: no config at "/usr/lib/ignition/user.ign" Mar 17 17:29:26.980888 ignition[656]: op(1): [started] loading QEMU firmware config module Mar 17 17:29:26.980892 ignition[656]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 17 17:29:26.989179 ignition[656]: op(1): [finished] loading QEMU firmware config module Mar 17 17:29:26.999738 systemd-networkd[765]: lo: Link UP Mar 17 17:29:26.999750 systemd-networkd[765]: lo: Gained carrier Mar 17 17:29:27.000462 systemd-networkd[765]: Enumeration completed Mar 17 17:29:27.000842 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:29:27.000845 systemd-networkd[765]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:29:27.002423 systemd-networkd[765]: eth0: Link UP Mar 17 17:29:27.002426 systemd-networkd[765]: eth0: Gained carrier Mar 17 17:29:27.002432 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:29:27.003852 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:29:27.005230 systemd[1]: Reached target network.target - Network. Mar 17 17:29:27.019906 systemd-networkd[765]: eth0: DHCPv4 address 10.0.0.79/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 17 17:29:27.036175 ignition[656]: parsing config with SHA512: ca0cc090cf19cb67a3c7156ebef92f6b8e73298266d34cf58882751411be678c6dc4794abef7ecb91b3b7fc7c2bc910dd3e94362c4feeba51bc35c83b84659c1 Mar 17 17:29:27.040942 unknown[656]: fetched base config from "system" Mar 17 17:29:27.040955 unknown[656]: fetched user config from "qemu" Mar 17 17:29:27.042469 ignition[656]: fetch-offline: fetch-offline passed Mar 17 17:29:27.042583 ignition[656]: Ignition finished successfully Mar 17 17:29:27.044192 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 17:29:27.045224 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 17 17:29:27.053026 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 17 17:29:27.063531 ignition[772]: Ignition 2.20.0 Mar 17 17:29:27.063542 ignition[772]: Stage: kargs Mar 17 17:29:27.063691 ignition[772]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:29:27.063704 ignition[772]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:29:27.064596 ignition[772]: kargs: kargs passed Mar 17 17:29:27.064637 ignition[772]: Ignition finished successfully Mar 17 17:29:27.066598 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Mar 17 17:29:27.074051 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 17 17:29:27.082776 ignition[781]: Ignition 2.20.0 Mar 17 17:29:27.082786 ignition[781]: Stage: disks Mar 17 17:29:27.082956 ignition[781]: no configs at "/usr/lib/ignition/base.d" Mar 17 17:29:27.082965 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:29:27.086030 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 17 17:29:27.083859 ignition[781]: disks: disks passed Mar 17 17:29:27.087112 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 17 17:29:27.083924 ignition[781]: Ignition finished successfully Mar 17 17:29:27.088627 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 17 17:29:27.090121 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:29:27.091695 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:29:27.093069 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:29:27.111074 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 17 17:29:27.120915 systemd-fsck[793]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 17 17:29:27.125278 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 17 17:29:27.134069 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 17 17:29:27.172882 kernel: EXT4-fs (vda9): mounted filesystem 6b579bf2-7716-4d59-98eb-b92ea668693e r/w with ordered data mode. Quota mode: none. Mar 17 17:29:27.173150 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 17 17:29:27.174176 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 17 17:29:27.187940 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:29:27.189486 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 17 17:29:27.190252 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 17 17:29:27.190288 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 17:29:27.190308 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:29:27.196251 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 17 17:29:27.201291 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (802) Mar 17 17:29:27.199928 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 17 17:29:27.204682 kernel: BTRFS info (device vda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:29:27.204718 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:29:27.204729 kernel: BTRFS info (device vda6): using free space tree Mar 17 17:29:27.206885 kernel: BTRFS info (device vda6): auto enabling async discard Mar 17 17:29:27.207824 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:29:27.246717 initrd-setup-root[827]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 17:29:27.250996 initrd-setup-root[834]: cut: /sysroot/etc/group: No such file or directory Mar 17 17:29:27.255092 initrd-setup-root[841]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 17:29:27.258746 initrd-setup-root[848]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 17:29:27.329594 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 17 17:29:27.340968 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 17 17:29:27.342430 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 17 17:29:27.347881 kernel: BTRFS info (device vda6): last unmount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:29:27.366819 ignition[915]: INFO : Ignition 2.20.0 Mar 17 17:29:27.366819 ignition[915]: INFO : Stage: mount Mar 17 17:29:27.368214 ignition[915]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:29:27.368214 ignition[915]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:29:27.369758 ignition[915]: INFO : mount: mount passed Mar 17 17:29:27.369758 ignition[915]: INFO : Ignition finished successfully Mar 17 17:29:27.370665 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 17 17:29:27.372904 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 17 17:29:27.383973 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 17 17:29:27.841120 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 17 17:29:27.857062 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 17 17:29:27.862888 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (929) Mar 17 17:29:27.864461 kernel: BTRFS info (device vda6): first mount of filesystem 3dbd9b64-bd31-4292-be10-51551993b53f Mar 17 17:29:27.864476 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 17 17:29:27.864487 kernel: BTRFS info (device vda6): using free space tree Mar 17 17:29:27.866884 kernel: BTRFS info (device vda6): auto enabling async discard Mar 17 17:29:27.867894 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 17 17:29:27.885332 ignition[946]: INFO : Ignition 2.20.0
Mar 17 17:29:27.885332 ignition[946]: INFO : Stage: files
Mar 17 17:29:27.886605 ignition[946]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:29:27.886605 ignition[946]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:29:27.886605 ignition[946]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:29:27.889295 ignition[946]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:29:27.889295 ignition[946]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:29:27.889295 ignition[946]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:29:27.889295 ignition[946]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:29:27.893364 ignition[946]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:29:27.893364 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1"
Mar 17 17:29:27.893364 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1"
Mar 17 17:29:27.893364 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:29:27.893364 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Mar 17 17:29:27.889542 unknown[946]: wrote ssh authorized keys file for user: core
Mar 17 17:29:27.935652 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK
Mar 17 17:29:28.103832 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:29:28.103832 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:29:28.106636 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1
Mar 17 17:29:28.353046 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK
Mar 17 17:29:28.432588 systemd-networkd[765]: eth0: Gained IPv6LL
Mar 17 17:29:28.704905 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 17 17:29:28.704905 ignition[946]: INFO : files: op(c): [started] processing unit "containerd.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(c): [finished] processing unit "containerd.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(e): [started] processing unit "prepare-helm.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(e): [finished] processing unit "prepare-helm.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(10): [started] processing unit "coreos-metadata.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(10): op(11): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(10): op(11): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(10): [finished] processing unit "coreos-metadata.service"
Mar 17 17:29:28.708620 ignition[946]: INFO : files: op(12): [started] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:29:28.736906 ignition[946]: INFO : files: op(12): op(13): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:29:28.740943 ignition[946]: INFO : files: op(12): op(13): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:29:28.743559 ignition[946]: INFO : files: op(12): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:29:28.743559 ignition[946]: INFO : files: op(14): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:29:28.743559 ignition[946]: INFO : files: op(14): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:29:28.743559 ignition[946]: INFO : files: createResultFile: createFiles: op(15): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:29:28.743559 ignition[946]: INFO : files: createResultFile: createFiles: op(15): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:29:28.743559 ignition[946]: INFO : files: files passed
Mar 17 17:29:28.743559 ignition[946]: INFO : Ignition finished successfully
Mar 17 17:29:28.744120 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:29:28.760070 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:29:28.762315 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:29:28.763999 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:29:28.764093 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:29:28.769425 initrd-setup-root-after-ignition[974]: grep: /sysroot/oem/oem-release: No such file or directory
Mar 17 17:29:28.772116 initrd-setup-root-after-ignition[976]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:29:28.772116 initrd-setup-root-after-ignition[976]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:29:28.775407 initrd-setup-root-after-ignition[980]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:29:28.774058 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:29:28.777216 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:29:28.787014 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:29:28.809453 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:29:28.809597 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:29:28.811699 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:29:28.813364 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:29:28.814937 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:29:28.815723 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:29:28.830926 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:29:28.841010 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:29:28.848541 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:29:28.849511 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:29:28.851295 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:29:28.852817 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:29:28.852948 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:29:28.855275 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:29:28.857014 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:29:28.858444 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:29:28.859884 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 17 17:29:28.861571 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 17 17:29:28.863273 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 17 17:29:28.864830 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 17 17:29:28.866506 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 17 17:29:28.868210 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 17 17:29:28.869666 systemd[1]: Stopped target swap.target - Swaps. Mar 17 17:29:28.870987 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 17:29:28.871106 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 17 17:29:28.873294 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:29:28.874944 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:29:28.876713 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 17 17:29:28.879930 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:29:28.880846 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 17:29:28.880979 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 17 17:29:28.883603 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 17:29:28.883710 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 17 17:29:28.885569 systemd[1]: Stopped target paths.target - Path Units. Mar 17 17:29:28.886943 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 17:29:28.892967 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:29:28.893965 systemd[1]: Stopped target slices.target - Slice Units. Mar 17 17:29:28.895825 systemd[1]: Stopped target sockets.target - Socket Units. Mar 17 17:29:28.897240 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 17:29:28.897331 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 17 17:29:28.898613 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 17:29:28.898687 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 17 17:29:28.900016 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 17:29:28.900123 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 17 17:29:28.901637 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 17:29:28.901734 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 17 17:29:28.914038 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 17 17:29:28.915574 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 17 17:29:28.916345 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 17:29:28.916456 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:29:28.918335 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 17 17:29:28.918431 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 17 17:29:28.923078 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 17:29:28.923173 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Mar 17 17:29:28.929605 ignition[1002]: INFO : Ignition 2.20.0 Mar 17 17:29:28.929605 ignition[1002]: INFO : Stage: umount Mar 17 17:29:28.929605 ignition[1002]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 17:29:28.929605 ignition[1002]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 17:29:28.929605 ignition[1002]: INFO : umount: umount passed Mar 17 17:29:28.929605 ignition[1002]: INFO : Ignition finished successfully Mar 17 17:29:28.931286 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 17:29:28.931756 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 17:29:28.931843 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 17 17:29:28.933818 systemd[1]: Stopped target network.target - Network. Mar 17 17:29:28.935281 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 17:29:28.935348 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 17 17:29:28.936665 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 17:29:28.936704 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 17 17:29:28.938205 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 17:29:28.938256 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 17 17:29:28.939769 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 17 17:29:28.939813 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 17 17:29:28.941589 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 17 17:29:28.943021 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 17 17:29:28.949905 systemd-networkd[765]: eth0: DHCPv6 lease lost Mar 17 17:29:28.951444 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 17:29:28.951594 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 17 17:29:28.953021 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 17:29:28.953133 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 17 17:29:28.955642 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 17:29:28.955683 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:29:28.970992 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 17 17:29:28.971732 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 17:29:28.971797 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 17 17:29:28.973694 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 17:29:28.973740 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:29:28.975412 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 17:29:28.975450 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 17 17:29:28.979494 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 17 17:29:28.979546 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:29:28.981892 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 17 17:29:28.985243 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 17:29:28.985331 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 17 17:29:28.994725 systemd[1]: initrd-setup-root.service: Deactivated successfully. 
Mar 17 17:29:28.994821 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 17 17:29:28.996654 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 17:29:28.997923 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 17 17:29:28.998918 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 17:29:28.999038 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:29:29.000839 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 17:29:29.000907 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 17 17:29:29.002373 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 17:29:29.002403 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:29:29.003659 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 17:29:29.003700 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 17 17:29:29.005646 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 17:29:29.005687 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 17 17:29:29.007612 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 17:29:29.007659 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 17 17:29:29.021037 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 17 17:29:29.021826 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 17:29:29.021904 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:29:29.023636 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 17 17:29:29.023680 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:29:29.025207 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 17:29:29.025254 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:29:29.026849 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 17:29:29.026903 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:29:29.029071 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 17:29:29.029156 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 17 17:29:29.031409 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 17 17:29:29.033009 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 17 17:29:29.043204 systemd[1]: Switching root. Mar 17 17:29:29.072496 systemd-journald[238]: Journal stopped Mar 17 17:29:29.825613 systemd-journald[238]: Received SIGTERM from PID 1 (systemd). 
Mar 17 17:29:29.825677 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 17:29:29.825693 kernel: SELinux: policy capability open_perms=1 Mar 17 17:29:29.825703 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 17:29:29.825712 kernel: SELinux: policy capability always_check_network=0 Mar 17 17:29:29.825722 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 17:29:29.825731 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 17:29:29.825741 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 17:29:29.825750 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 17:29:29.825760 kernel: audit: type=1403 audit(1742232569.250:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 17 17:29:29.825770 systemd[1]: Successfully loaded SELinux policy in 30.229ms. Mar 17 17:29:29.825792 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.063ms. Mar 17 17:29:29.825803 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 17 17:29:29.825814 systemd[1]: Detected virtualization kvm. Mar 17 17:29:29.825829 systemd[1]: Detected architecture arm64. Mar 17 17:29:29.825839 systemd[1]: Detected first boot. Mar 17 17:29:29.825850 systemd[1]: Initializing machine ID from VM UUID. Mar 17 17:29:29.825860 zram_generator::config[1063]: No configuration found. Mar 17 17:29:29.825891 systemd[1]: Populated /etc with preset unit settings. Mar 17 17:29:29.825904 systemd[1]: Queued start job for default target multi-user.target. Mar 17 17:29:29.825914 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 17 17:29:29.825925 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 17 17:29:29.825936 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 17 17:29:29.825947 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 17 17:29:29.825957 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 17 17:29:29.825968 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 17 17:29:29.825978 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 17 17:29:29.825990 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 17 17:29:29.826000 systemd[1]: Created slice user.slice - User and Session Slice. Mar 17 17:29:29.826011 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 17 17:29:29.826021 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 17 17:29:29.826032 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 17 17:29:29.826042 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 17 17:29:29.826052 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 17 17:29:29.826063 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Mar 17 17:29:29.826073 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 17 17:29:29.826086 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 17 17:29:29.826096 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 17 17:29:29.826107 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 17 17:29:29.826117 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 17 17:29:29.826127 systemd[1]: Reached target slices.target - Slice Units. Mar 17 17:29:29.826138 systemd[1]: Reached target swap.target - Swaps. Mar 17 17:29:29.826148 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 17 17:29:29.826158 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 17 17:29:29.826170 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 17 17:29:29.826181 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 17 17:29:29.826191 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 17 17:29:29.826201 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 17 17:29:29.826216 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 17 17:29:29.826228 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 17 17:29:29.826238 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 17 17:29:29.826248 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 17 17:29:29.826258 systemd[1]: Mounting media.mount - External Media Directory... Mar 17 17:29:29.826269 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 17 17:29:29.826281 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 17 17:29:29.826291 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 17 17:29:29.826302 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 17 17:29:29.826312 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 17:29:29.826323 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 17 17:29:29.826333 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 17 17:29:29.826344 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 17:29:29.826365 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 17 17:29:29.826377 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 17:29:29.826387 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 17 17:29:29.826397 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 17:29:29.826408 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 17:29:29.826418 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. Mar 17 17:29:29.826429 systemd[1]: systemd-journald.service: (This warning is only shown for the first unit using IP firewalling.) Mar 17 17:29:29.826440 systemd[1]: Starting systemd-journald.service - Journal Service... 
Mar 17 17:29:29.826450 kernel: fuse: init (API version 7.39) Mar 17 17:29:29.826460 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 17 17:29:29.826471 kernel: loop: module loaded Mar 17 17:29:29.826481 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 17 17:29:29.826492 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 17 17:29:29.826502 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 17 17:29:29.826512 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 17 17:29:29.826522 kernel: ACPI: bus type drm_connector registered Mar 17 17:29:29.826536 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 17 17:29:29.826547 systemd[1]: Mounted media.mount - External Media Directory. Mar 17 17:29:29.826561 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 17 17:29:29.826573 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 17 17:29:29.826584 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 17 17:29:29.826595 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 17 17:29:29.826623 systemd-journald[1145]: Collecting audit messages is disabled. Mar 17 17:29:29.826646 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 17:29:29.826656 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 17 17:29:29.826668 systemd-journald[1145]: Journal started Mar 17 17:29:29.826691 systemd-journald[1145]: Runtime Journal (/run/log/journal/53a8b9d5b9df4474855c8c9be5266e97) is 5.9M, max 47.3M, 41.4M free. Mar 17 17:29:29.828892 systemd[1]: Started systemd-journald.service - Journal Service. Mar 17 17:29:29.829560 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 17 17:29:29.830669 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 17:29:29.830844 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 17:29:29.832101 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 17:29:29.832282 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 17 17:29:29.833543 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 17:29:29.833712 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 17:29:29.834977 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 17:29:29.835138 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 17 17:29:29.836186 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 17:29:29.836444 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 17:29:29.837746 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 17 17:29:29.838977 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 17 17:29:29.840499 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 17 17:29:29.853680 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 17 17:29:29.872005 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 17 17:29:29.874269 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Mar 17 17:29:29.875144 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 17:29:29.877560 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 17 17:29:29.879638 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 17 17:29:29.880615 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 17:29:29.882021 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 17 17:29:29.884012 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 17 17:29:29.887123 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 17 17:29:29.888519 systemd-journald[1145]: Time spent on flushing to /var/log/journal/53a8b9d5b9df4474855c8c9be5266e97 is 19.250ms for 845 entries. Mar 17 17:29:29.888519 systemd-journald[1145]: System Journal (/var/log/journal/53a8b9d5b9df4474855c8c9be5266e97) is 8.0M, max 195.6M, 187.6M free. Mar 17 17:29:29.918161 systemd-journald[1145]: Received client request to flush runtime journal. Mar 17 17:29:29.889829 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 17 17:29:29.892534 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 17 17:29:29.893812 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 17 17:29:29.894990 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 17 17:29:29.903019 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 17 17:29:29.904397 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 17 17:29:29.906390 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 17 17:29:29.919464 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 17 17:29:29.925936 systemd-tmpfiles[1198]: ACLs are not supported, ignoring. Mar 17 17:29:29.925950 systemd-tmpfiles[1198]: ACLs are not supported, ignoring. Mar 17 17:29:29.926267 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 17 17:29:29.928381 udevadm[1205]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 17 17:29:29.930290 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 17 17:29:29.937112 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 17 17:29:29.958665 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 17 17:29:29.970076 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 17 17:29:29.982293 systemd-tmpfiles[1219]: ACLs are not supported, ignoring. Mar 17 17:29:29.982313 systemd-tmpfiles[1219]: ACLs are not supported, ignoring. Mar 17 17:29:29.986235 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 17 17:29:30.351882 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 17 17:29:30.363045 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Mar 17 17:29:30.388389 systemd-udevd[1225]: Using default interface naming scheme 'v255'. Mar 17 17:29:30.405384 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 17 17:29:30.418049 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 17 17:29:30.426046 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 17 17:29:30.439788 systemd[1]: Found device dev-ttyAMA0.device - /dev/ttyAMA0. Mar 17 17:29:30.449899 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (1229) Mar 17 17:29:30.477416 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 17 17:29:30.505036 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 17 17:29:30.550295 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 17 17:29:30.557085 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 17 17:29:30.559953 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 17 17:29:30.576005 systemd-networkd[1232]: lo: Link UP Mar 17 17:29:30.576015 systemd-networkd[1232]: lo: Gained carrier Mar 17 17:29:30.576884 systemd-networkd[1232]: Enumeration completed Mar 17 17:29:30.578491 lvm[1261]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:29:30.578854 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 17 17:29:30.579043 systemd-networkd[1232]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:29:30.579054 systemd-networkd[1232]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:29:30.579698 systemd-networkd[1232]: eth0: Link UP Mar 17 17:29:30.579707 systemd-networkd[1232]: eth0: Gained carrier Mar 17 17:29:30.579721 systemd-networkd[1232]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:29:30.587058 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 17 17:29:30.595847 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:29:30.596946 systemd-networkd[1232]: eth0: DHCPv4 address 10.0.0.79/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 17 17:29:30.600451 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 17 17:29:30.601671 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:29:30.624044 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 17 17:29:30.627787 lvm[1271]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:29:30.656402 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 17 17:29:30.657551 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 17 17:29:30.658519 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 17:29:30.658549 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 17 17:29:30.659395 systemd[1]: Reached target machines.target - Containers. 
Mar 17 17:29:30.661109 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Mar 17 17:29:30.673044 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 17 17:29:30.675229 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 17 17:29:30.676092 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 17:29:30.677064 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 17 17:29:30.679339 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Mar 17 17:29:30.682076 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 17 17:29:30.686665 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 17 17:29:30.695272 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 17 17:29:30.699888 kernel: loop0: detected capacity change from 0 to 113536 Mar 17 17:29:30.705559 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 17:29:30.706492 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Mar 17 17:29:30.713897 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 17:29:30.749886 kernel: loop1: detected capacity change from 0 to 194096 Mar 17 17:29:30.791063 kernel: loop2: detected capacity change from 0 to 116808 Mar 17 17:29:30.829889 kernel: loop3: detected capacity change from 0 to 113536 Mar 17 17:29:30.837891 kernel: loop4: detected capacity change from 0 to 194096 Mar 17 17:29:30.846892 kernel: loop5: detected capacity change from 0 to 116808 Mar 17 17:29:30.858512 (sd-merge)[1292]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Mar 17 17:29:30.858950 (sd-merge)[1292]: Merged extensions into '/usr'. Mar 17 17:29:30.862618 systemd[1]: Reloading requested from client PID 1279 ('systemd-sysext') (unit systemd-sysext.service)... Mar 17 17:29:30.862632 systemd[1]: Reloading... Mar 17 17:29:30.902904 zram_generator::config[1320]: No configuration found. Mar 17 17:29:30.946221 ldconfig[1276]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 17:29:31.006118 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:29:31.049290 systemd[1]: Reloading finished in 186 ms. Mar 17 17:29:31.069964 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 17 17:29:31.071441 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 17 17:29:31.091048 systemd[1]: Starting ensure-sysext.service... Mar 17 17:29:31.093158 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 17 17:29:31.098469 systemd[1]: Reloading requested from client PID 1361 ('systemctl') (unit ensure-sysext.service)... Mar 17 17:29:31.098487 systemd[1]: Reloading... Mar 17 17:29:31.111641 systemd-tmpfiles[1362]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Mar 17 17:29:31.111951 systemd-tmpfiles[1362]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 17 17:29:31.112648 systemd-tmpfiles[1362]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 17:29:31.113060 systemd-tmpfiles[1362]: ACLs are not supported, ignoring. Mar 17 17:29:31.113124 systemd-tmpfiles[1362]: ACLs are not supported, ignoring. Mar 17 17:29:31.115339 systemd-tmpfiles[1362]: Detected autofs mount point /boot during canonicalization of boot. Mar 17 17:29:31.115352 systemd-tmpfiles[1362]: Skipping /boot Mar 17 17:29:31.122608 systemd-tmpfiles[1362]: Detected autofs mount point /boot during canonicalization of boot. Mar 17 17:29:31.122625 systemd-tmpfiles[1362]: Skipping /boot Mar 17 17:29:31.153930 zram_generator::config[1404]: No configuration found. Mar 17 17:29:31.227984 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:29:31.271471 systemd[1]: Reloading finished in 172 ms. Mar 17 17:29:31.286898 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 17 17:29:31.301277 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 17 17:29:31.303728 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 17 17:29:31.306341 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 17 17:29:31.309510 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 17 17:29:31.314113 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 17 17:29:31.318990 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 17:29:31.320645 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 17:29:31.324274 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 17:29:31.329412 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 17:29:31.330756 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 17:29:31.331496 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 17:29:31.331648 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 17:29:31.341528 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 17:29:31.341698 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 17:29:31.343386 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 17:29:31.343535 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 17:29:31.347065 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 17 17:29:31.351651 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 17 17:29:31.356749 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 17 17:29:31.365189 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 17 17:29:31.370236 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Mar 17 17:29:31.376113 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 17 17:29:31.380180 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 17 17:29:31.382377 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 17 17:29:31.384087 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 17 17:29:31.386941 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 17 17:29:31.388411 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 17:29:31.388559 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 17 17:29:31.389829 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 17:29:31.389984 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 17 17:29:31.391259 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 17:29:31.391399 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 17 17:29:31.392853 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 17:29:31.395817 augenrules[1482]: No rules Mar 17 17:29:31.396196 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 17 17:29:31.400628 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 17:29:31.401074 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 17 17:29:31.402231 systemd-resolved[1437]: Positive Trust Anchors: Mar 17 17:29:31.404058 systemd-resolved[1437]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 17:29:31.404092 systemd-resolved[1437]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 17 17:29:31.405424 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 17 17:29:31.407743 systemd[1]: Finished ensure-sysext.service. Mar 17 17:29:31.409787 systemd-resolved[1437]: Defaulting to hostname 'linux'. Mar 17 17:29:31.412776 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 17:29:31.412851 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 17 17:29:31.422220 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 17 17:29:31.423133 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 17:29:31.423348 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 17 17:29:31.424458 systemd[1]: Reached target network.target - Network. Mar 17 17:29:31.425223 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Mar 17 17:29:31.465716 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 17 17:29:31.466543 systemd-timesyncd[1499]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 17 17:29:31.466589 systemd-timesyncd[1499]: Initial clock synchronization to Mon 2025-03-17 17:29:31.333814 UTC. Mar 17 17:29:31.467049 systemd[1]: Reached target sysinit.target - System Initialization. Mar 17 17:29:31.467878 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 17 17:29:31.468743 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 17 17:29:31.469679 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 17 17:29:31.470603 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 17:29:31.470635 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:29:31.471303 systemd[1]: Reached target time-set.target - System Time Set. Mar 17 17:29:31.472147 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 17 17:29:31.473011 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 17 17:29:31.473891 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:29:31.475303 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 17 17:29:31.477616 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 17 17:29:31.479475 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 17 17:29:31.484752 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 17 17:29:31.485587 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:29:31.486317 systemd[1]: Reached target basic.target - Basic System. Mar 17 17:29:31.487121 systemd[1]: System is tainted: cgroupsv1 Mar 17 17:29:31.487167 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:29:31.487189 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:29:31.488434 systemd[1]: Starting containerd.service - containerd container runtime... Mar 17 17:29:31.490402 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 17 17:29:31.492271 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 17 17:29:31.496042 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 17 17:29:31.496849 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 17 17:29:31.499034 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 17 17:29:31.502949 jq[1505]: false Mar 17 17:29:31.505742 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 17 17:29:31.511033 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 17 17:29:31.516114 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Mar 17 17:29:31.518981 extend-filesystems[1507]: Found loop3 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found loop4 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found loop5 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda1 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda2 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda3 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found usr Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda4 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda6 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda7 Mar 17 17:29:31.520556 extend-filesystems[1507]: Found vda9 Mar 17 17:29:31.520556 extend-filesystems[1507]: Checking size of /dev/vda9 Mar 17 17:29:31.519652 dbus-daemon[1504]: [system] SELinux support is enabled Mar 17 17:29:31.521148 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 17 17:29:31.528437 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 17:29:31.532058 systemd[1]: Starting update-engine.service - Update Engine... Mar 17 17:29:31.534780 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 17 17:29:31.537966 extend-filesystems[1507]: Resized partition /dev/vda9 Mar 17 17:29:31.536236 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 17 17:29:31.542729 jq[1530]: true Mar 17 17:29:31.543245 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 17:29:31.543512 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 17 17:29:31.543765 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 17:29:31.544019 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 17 17:29:31.548519 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 17:29:31.548736 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 17 17:29:31.565903 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (1245) Mar 17 17:29:31.568193 (ntainerd)[1538]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 17 17:29:31.576872 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 17 17:29:31.576921 jq[1537]: true Mar 17 17:29:31.577086 extend-filesystems[1532]: resize2fs 1.47.1 (20-May-2024) Mar 17 17:29:31.578368 tar[1535]: linux-arm64/helm Mar 17 17:29:31.582480 systemd-logind[1523]: Watching system buttons on /dev/input/event0 (Power Button) Mar 17 17:29:31.586789 systemd-logind[1523]: New seat seat0. Mar 17 17:29:31.590700 systemd[1]: Started systemd-logind.service - User Login Management. Mar 17 17:29:31.598600 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 17:29:31.598745 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 17 17:29:31.601601 update_engine[1526]: I20250317 17:29:31.601281 1526 main.cc:92] Flatcar Update Engine starting Mar 17 17:29:31.602134 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 17:29:31.602266 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 17 17:29:31.613916 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 17 17:29:31.615491 systemd[1]: Started update-engine.service - Update Engine. Mar 17 17:29:31.632134 update_engine[1526]: I20250317 17:29:31.615515 1526 update_check_scheduler.cc:74] Next update check in 9m20s Mar 17 17:29:31.617149 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 17 17:29:31.626254 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 17 17:29:31.634775 extend-filesystems[1532]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 17 17:29:31.634775 extend-filesystems[1532]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 17 17:29:31.634775 extend-filesystems[1532]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 17 17:29:31.643220 extend-filesystems[1507]: Resized filesystem in /dev/vda9 Mar 17 17:29:31.636454 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 17:29:31.636694 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 17 17:29:31.662684 bash[1564]: Updated "/home/core/.ssh/authorized_keys" Mar 17 17:29:31.663260 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 17 17:29:31.665579 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 17 17:29:31.683343 locksmithd[1565]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 17:29:31.804043 containerd[1538]: time="2025-03-17T17:29:31.803912920Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Mar 17 17:29:31.823961 systemd-networkd[1232]: eth0: Gained IPv6LL Mar 17 17:29:31.832901 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 17 17:29:31.834287 systemd[1]: Reached target network-online.target - Network is Online. Mar 17 17:29:31.834808 containerd[1538]: time="2025-03-17T17:29:31.834774760Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.836546 containerd[1538]: time="2025-03-17T17:29:31.836514160Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:29:31.836624 containerd[1538]: time="2025-03-17T17:29:31.836611080Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 17:29:31.836676 containerd[1538]: time="2025-03-17T17:29:31.836663920Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 17:29:31.836906 containerd[1538]: time="2025-03-17T17:29:31.836885560Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." 
type=io.containerd.warning.v1 Mar 17 17:29:31.836972 containerd[1538]: time="2025-03-17T17:29:31.836959400Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837077 containerd[1538]: time="2025-03-17T17:29:31.837061000Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837143 containerd[1538]: time="2025-03-17T17:29:31.837129440Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837399 containerd[1538]: time="2025-03-17T17:29:31.837376040Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837460 containerd[1538]: time="2025-03-17T17:29:31.837448320Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837513 containerd[1538]: time="2025-03-17T17:29:31.837500840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837556 containerd[1538]: time="2025-03-17T17:29:31.837545760Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837688 containerd[1538]: time="2025-03-17T17:29:31.837673600Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.837952 containerd[1538]: time="2025-03-17T17:29:31.837932160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:29:31.838138 containerd[1538]: time="2025-03-17T17:29:31.838120160Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:29:31.838222 containerd[1538]: time="2025-03-17T17:29:31.838206600Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 17:29:31.838367 containerd[1538]: time="2025-03-17T17:29:31.838350880Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 17:29:31.838463 containerd[1538]: time="2025-03-17T17:29:31.838449880Z" level=info msg="metadata content store policy set" policy=shared Mar 17 17:29:31.842430 containerd[1538]: time="2025-03-17T17:29:31.842402760Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 17:29:31.842537 containerd[1538]: time="2025-03-17T17:29:31.842524840Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 17:29:31.842677 containerd[1538]: time="2025-03-17T17:29:31.842661720Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 17 17:29:31.842749 containerd[1538]: time="2025-03-17T17:29:31.842737480Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." 
type=io.containerd.streaming.v1 Mar 17 17:29:31.842831 containerd[1538]: time="2025-03-17T17:29:31.842817640Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 17:29:31.843032 containerd[1538]: time="2025-03-17T17:29:31.843013200Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 17:29:31.843573 containerd[1538]: time="2025-03-17T17:29:31.843552480Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 17:29:31.843796 containerd[1538]: time="2025-03-17T17:29:31.843775640Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 17 17:29:31.843948 containerd[1538]: time="2025-03-17T17:29:31.843930400Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 17 17:29:31.844061 containerd[1538]: time="2025-03-17T17:29:31.844046120Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 17 17:29:31.844127 containerd[1538]: time="2025-03-17T17:29:31.844107360Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844163 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 17 17:29:31.844640 containerd[1538]: time="2025-03-17T17:29:31.844499080Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844640 containerd[1538]: time="2025-03-17T17:29:31.844523960Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844640 containerd[1538]: time="2025-03-17T17:29:31.844540560Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844640 containerd[1538]: time="2025-03-17T17:29:31.844555120Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844640 containerd[1538]: time="2025-03-17T17:29:31.844572440Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844640 containerd[1538]: time="2025-03-17T17:29:31.844584640Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.844854 containerd[1538]: time="2025-03-17T17:29:31.844793880Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.844927120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.844950120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.844963520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.844975200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." 
type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.844986400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845000480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845011520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845023760Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845036200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845050240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845063000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845074400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845159 containerd[1538]: time="2025-03-17T17:29:31.845086240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845667 containerd[1538]: time="2025-03-17T17:29:31.845100920Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 17 17:29:31.845667 containerd[1538]: time="2025-03-17T17:29:31.845509360Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845667 containerd[1538]: time="2025-03-17T17:29:31.845526400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.845667 containerd[1538]: time="2025-03-17T17:29:31.845544400Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 17:29:31.845967 containerd[1538]: time="2025-03-17T17:29:31.845947800Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 17:29:31.846086 containerd[1538]: time="2025-03-17T17:29:31.846025640Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 17 17:29:31.846390 containerd[1538]: time="2025-03-17T17:29:31.846129160Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 17:29:31.846390 containerd[1538]: time="2025-03-17T17:29:31.846150600Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 17 17:29:31.846390 containerd[1538]: time="2025-03-17T17:29:31.846163160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.846390 containerd[1538]: time="2025-03-17T17:29:31.846176280Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Mar 17 17:29:31.846390 containerd[1538]: time="2025-03-17T17:29:31.846185520Z" level=info msg="NRI interface is disabled by configuration." Mar 17 17:29:31.846390 containerd[1538]: time="2025-03-17T17:29:31.846206520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Mar 17 17:29:31.846547 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:31.847111 containerd[1538]: time="2025-03-17T17:29:31.847056160Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 17:29:31.847324 containerd[1538]: time="2025-03-17T17:29:31.847269120Z" level=info msg="Connect containerd service" Mar 17 17:29:31.847509 containerd[1538]: time="2025-03-17T17:29:31.847480280Z" level=info msg="using legacy CRI server" Mar 17 17:29:31.847509 containerd[1538]: time="2025-03-17T17:29:31.847509800Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 17 17:29:31.848614 containerd[1538]: time="2025-03-17T17:29:31.848015960Z" level=info msg="Get image filesystem path 
\"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 17:29:31.848886 containerd[1538]: time="2025-03-17T17:29:31.848842200Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 17:29:31.850113 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 17 17:29:31.852978 containerd[1538]: time="2025-03-17T17:29:31.852836760Z" level=info msg="Start subscribing containerd event" Mar 17 17:29:31.853333 containerd[1538]: time="2025-03-17T17:29:31.853297200Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 17:29:31.853384 containerd[1538]: time="2025-03-17T17:29:31.853363120Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 17:29:31.853999 containerd[1538]: time="2025-03-17T17:29:31.853975200Z" level=info msg="Start recovering state" Mar 17 17:29:31.856494 containerd[1538]: time="2025-03-17T17:29:31.856403640Z" level=info msg="Start event monitor" Mar 17 17:29:31.856579 containerd[1538]: time="2025-03-17T17:29:31.856560280Z" level=info msg="Start snapshots syncer" Mar 17 17:29:31.856678 containerd[1538]: time="2025-03-17T17:29:31.856664160Z" level=info msg="Start cni network conf syncer for default" Mar 17 17:29:31.856732 containerd[1538]: time="2025-03-17T17:29:31.856720960Z" level=info msg="Start streaming server" Mar 17 17:29:31.857169 containerd[1538]: time="2025-03-17T17:29:31.857153200Z" level=info msg="containerd successfully booted in 0.056546s" Mar 17 17:29:31.857986 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 17:29:31.880969 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 17 17:29:31.881238 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 17 17:29:31.882433 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 17 17:29:31.894338 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 17 17:29:32.002162 tar[1535]: linux-arm64/LICENSE Mar 17 17:29:32.002162 tar[1535]: linux-arm64/README.md Mar 17 17:29:32.013244 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 17 17:29:32.037466 sshd_keygen[1528]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 17:29:32.056298 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 17 17:29:32.066175 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 17 17:29:32.072674 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 17:29:32.072929 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 17 17:29:32.075527 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 17 17:29:32.087024 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 17 17:29:32.089587 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 17 17:29:32.091603 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 17 17:29:32.092670 systemd[1]: Reached target getty.target - Login Prompts. Mar 17 17:29:32.359347 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:32.360570 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 17 17:29:32.361669 systemd[1]: Startup finished in 5.108s (kernel) + 3.141s (userspace) = 8.249s. 
Mar 17 17:29:32.364012 (kubelet)[1641]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:29:32.814155 kubelet[1641]: E0317 17:29:32.814046 1641 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 17:29:32.816625 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 17:29:32.816808 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 17:29:37.167592 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 17 17:29:37.185086 systemd[1]: Started sshd@0-10.0.0.79:22-10.0.0.1:33974.service - OpenSSH per-connection server daemon (10.0.0.1:33974). Mar 17 17:29:37.237133 sshd[1655]: Accepted publickey for core from 10.0.0.1 port 33974 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:37.238536 sshd-session[1655]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:37.253167 systemd-logind[1523]: New session 1 of user core. Mar 17 17:29:37.254109 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 17 17:29:37.261065 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 17 17:29:37.272417 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 17 17:29:37.274963 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 17 17:29:37.283516 (systemd)[1661]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 17:29:37.370481 systemd[1661]: Queued start job for default target default.target. Mar 17 17:29:37.370910 systemd[1661]: Created slice app.slice - User Application Slice. Mar 17 17:29:37.370936 systemd[1661]: Reached target paths.target - Paths. Mar 17 17:29:37.370948 systemd[1661]: Reached target timers.target - Timers. Mar 17 17:29:37.382987 systemd[1661]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 17 17:29:37.388933 systemd[1661]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 17 17:29:37.389007 systemd[1661]: Reached target sockets.target - Sockets. Mar 17 17:29:37.389020 systemd[1661]: Reached target basic.target - Basic System. Mar 17 17:29:37.389062 systemd[1661]: Reached target default.target - Main User Target. Mar 17 17:29:37.389089 systemd[1661]: Startup finished in 100ms. Mar 17 17:29:37.389461 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 17 17:29:37.391408 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 17 17:29:37.446128 systemd[1]: Started sshd@1-10.0.0.79:22-10.0.0.1:33988.service - OpenSSH per-connection server daemon (10.0.0.1:33988). Mar 17 17:29:37.481707 sshd[1673]: Accepted publickey for core from 10.0.0.1 port 33988 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:37.482957 sshd-session[1673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:37.487536 systemd-logind[1523]: New session 2 of user core. Mar 17 17:29:37.500141 systemd[1]: Started session-2.scope - Session 2 of User core. 
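The kubelet exit above (status=1/FAILURE) is the normal pre-bootstrap state: /var/lib/kubelet/config.yaml does not exist until kubeadm (or equivalent provisioning) writes it, so systemd keeps restarting the unit, which is the loop visible again at 17:29:43 and 17:29:53 below. As a rough sketch, the missing file is a KubeletConfiguration along these lines; the values are assumptions chosen to match what the kubelet later logs in this boot, not the file kubeadm would actually generate:

import pathlib

# Minimal KubeletConfiguration sketch; all values are assumptions for
# illustration only.
KUBELET_CONFIG = """\
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: cgroupfs                    # matches CgroupDriver in the container manager dump later in this log
staticPodPath: /etc/kubernetes/manifests  # matches the "Adding static pod path" line later in this log
authentication:
  anonymous:
    enabled: false
authorization:
  mode: Webhook
"""

path = pathlib.Path("/var/lib/kubelet/config.yaml")
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(KUBELET_CONFIG)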
Mar 17 17:29:37.551638 sshd[1676]: Connection closed by 10.0.0.1 port 33988 Mar 17 17:29:37.552163 sshd-session[1673]: pam_unix(sshd:session): session closed for user core Mar 17 17:29:37.562126 systemd[1]: Started sshd@2-10.0.0.79:22-10.0.0.1:33992.service - OpenSSH per-connection server daemon (10.0.0.1:33992). Mar 17 17:29:37.562514 systemd[1]: sshd@1-10.0.0.79:22-10.0.0.1:33988.service: Deactivated successfully. Mar 17 17:29:37.564257 systemd-logind[1523]: Session 2 logged out. Waiting for processes to exit. Mar 17 17:29:37.564902 systemd[1]: session-2.scope: Deactivated successfully. Mar 17 17:29:37.566308 systemd-logind[1523]: Removed session 2. Mar 17 17:29:37.599468 sshd[1678]: Accepted publickey for core from 10.0.0.1 port 33992 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:37.601325 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:37.605482 systemd-logind[1523]: New session 3 of user core. Mar 17 17:29:37.615140 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 17 17:29:37.662915 sshd[1684]: Connection closed by 10.0.0.1 port 33992 Mar 17 17:29:37.663764 sshd-session[1678]: pam_unix(sshd:session): session closed for user core Mar 17 17:29:37.674095 systemd[1]: Started sshd@3-10.0.0.79:22-10.0.0.1:34008.service - OpenSSH per-connection server daemon (10.0.0.1:34008). Mar 17 17:29:37.674464 systemd[1]: sshd@2-10.0.0.79:22-10.0.0.1:33992.service: Deactivated successfully. Mar 17 17:29:37.676872 systemd[1]: session-3.scope: Deactivated successfully. Mar 17 17:29:37.677093 systemd-logind[1523]: Session 3 logged out. Waiting for processes to exit. Mar 17 17:29:37.678158 systemd-logind[1523]: Removed session 3. Mar 17 17:29:37.709763 sshd[1686]: Accepted publickey for core from 10.0.0.1 port 34008 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:37.711240 sshd-session[1686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:37.715051 systemd-logind[1523]: New session 4 of user core. Mar 17 17:29:37.726156 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 17 17:29:37.777476 sshd[1692]: Connection closed by 10.0.0.1 port 34008 Mar 17 17:29:37.777369 sshd-session[1686]: pam_unix(sshd:session): session closed for user core Mar 17 17:29:37.786103 systemd[1]: Started sshd@4-10.0.0.79:22-10.0.0.1:34010.service - OpenSSH per-connection server daemon (10.0.0.1:34010). Mar 17 17:29:37.786488 systemd[1]: sshd@3-10.0.0.79:22-10.0.0.1:34008.service: Deactivated successfully. Mar 17 17:29:37.788283 systemd-logind[1523]: Session 4 logged out. Waiting for processes to exit. Mar 17 17:29:37.788955 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 17:29:37.790498 systemd-logind[1523]: Removed session 4. Mar 17 17:29:37.822082 sshd[1694]: Accepted publickey for core from 10.0.0.1 port 34010 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:37.823314 sshd-session[1694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:37.827485 systemd-logind[1523]: New session 5 of user core. Mar 17 17:29:37.842170 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 17 17:29:37.903949 sudo[1701]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 17:29:37.904240 sudo[1701]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:29:37.928825 sudo[1701]: pam_unix(sudo:session): session closed for user root Mar 17 17:29:37.930882 sshd[1700]: Connection closed by 10.0.0.1 port 34010 Mar 17 17:29:37.930760 sshd-session[1694]: pam_unix(sshd:session): session closed for user core Mar 17 17:29:37.940190 systemd[1]: Started sshd@5-10.0.0.79:22-10.0.0.1:34014.service - OpenSSH per-connection server daemon (10.0.0.1:34014). Mar 17 17:29:37.940593 systemd[1]: sshd@4-10.0.0.79:22-10.0.0.1:34010.service: Deactivated successfully. Mar 17 17:29:37.942438 systemd-logind[1523]: Session 5 logged out. Waiting for processes to exit. Mar 17 17:29:37.943042 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 17:29:37.944531 systemd-logind[1523]: Removed session 5. Mar 17 17:29:37.976589 sshd[1703]: Accepted publickey for core from 10.0.0.1 port 34014 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:37.977770 sshd-session[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:37.981925 systemd-logind[1523]: New session 6 of user core. Mar 17 17:29:37.993157 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 17 17:29:38.044036 sudo[1711]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 17:29:38.044320 sudo[1711]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:29:38.047632 sudo[1711]: pam_unix(sudo:session): session closed for user root Mar 17 17:29:38.052418 sudo[1710]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 17 17:29:38.052693 sudo[1710]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:29:38.076212 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 17 17:29:38.100346 augenrules[1733]: No rules Mar 17 17:29:38.101671 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 17:29:38.102013 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 17 17:29:38.103961 sudo[1710]: pam_unix(sudo:session): session closed for user root Mar 17 17:29:38.105217 sshd[1709]: Connection closed by 10.0.0.1 port 34014 Mar 17 17:29:38.105651 sshd-session[1703]: pam_unix(sshd:session): session closed for user core Mar 17 17:29:38.123184 systemd[1]: Started sshd@6-10.0.0.79:22-10.0.0.1:34022.service - OpenSSH per-connection server daemon (10.0.0.1:34022). Mar 17 17:29:38.123668 systemd[1]: sshd@5-10.0.0.79:22-10.0.0.1:34014.service: Deactivated successfully. Mar 17 17:29:38.125288 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 17:29:38.125870 systemd-logind[1523]: Session 6 logged out. Waiting for processes to exit. Mar 17 17:29:38.127598 systemd-logind[1523]: Removed session 6. Mar 17 17:29:38.159251 sshd[1739]: Accepted publickey for core from 10.0.0.1 port 34022 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:29:38.160546 sshd-session[1739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:29:38.164640 systemd-logind[1523]: New session 7 of user core. Mar 17 17:29:38.176166 systemd[1]: Started session-7.scope - Session 7 of User core. 
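The audit sequence above (removing the two rules files, restarting audit-rules.service, augenrules reporting "No rules") follows from how augenrules builds the kernel ruleset: it merges /etc/audit/rules.d/*.rules into /etc/audit/audit.rules and loads the result, so an empty drop-in directory yields an empty ruleset. A rough Python sketch of that merge step, for illustration only (augenrules itself also strips comments and handles options):

from pathlib import Path

def merge_audit_rules(rules_d="/etc/audit/rules.d", out="/etc/audit/audit.rules"):
    """Concatenate *.rules drop-ins in lexical order, roughly what augenrules
    does before loading the merged file."""
    parts = sorted(Path(rules_d).glob("*.rules"))
    merged = "\n".join(p.read_text().rstrip("\n") for p in parts)
    Path(out).write_text(merged + "\n" if merged else "")
    return parts

if __name__ == "__main__":
    files = merge_audit_rules()
    print("No rules" if not files else f"merged {len(files)} file(s)")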
Mar 17 17:29:38.227272 sudo[1746]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 17:29:38.227564 sudo[1746]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 17 17:29:38.557134 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 17 17:29:38.557370 (dockerd)[1766]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 17 17:29:38.828498 dockerd[1766]: time="2025-03-17T17:29:38.828354769Z" level=info msg="Starting up" Mar 17 17:29:39.094621 dockerd[1766]: time="2025-03-17T17:29:39.094520741Z" level=info msg="Loading containers: start." Mar 17 17:29:39.234891 kernel: Initializing XFRM netlink socket Mar 17 17:29:39.304308 systemd-networkd[1232]: docker0: Link UP Mar 17 17:29:39.342424 dockerd[1766]: time="2025-03-17T17:29:39.342382413Z" level=info msg="Loading containers: done." Mar 17 17:29:39.358463 dockerd[1766]: time="2025-03-17T17:29:39.358348530Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 17 17:29:39.358596 dockerd[1766]: time="2025-03-17T17:29:39.358464451Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1 Mar 17 17:29:39.358596 dockerd[1766]: time="2025-03-17T17:29:39.358585617Z" level=info msg="Daemon has completed initialization" Mar 17 17:29:39.398827 dockerd[1766]: time="2025-03-17T17:29:39.398695615Z" level=info msg="API listen on /run/docker.sock" Mar 17 17:29:39.398961 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 17 17:29:40.198346 containerd[1538]: time="2025-03-17T17:29:40.198307425Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 17 17:29:40.879081 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2602123523.mount: Deactivated successfully. 
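With "API listen on /run/docker.sock" above, the daemon is reachable over its UNIX socket. A stdlib-only probe of that endpoint, assuming the default socket path and using an HTTP/1.0 request so the reply comes back unchunked:

import json
import socket

def docker_version(sock_path="/run/docker.sock"):
    """Fetch /version from the Docker Engine API over its UNIX socket."""
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    s.connect(sock_path)
    s.sendall(b"GET /version HTTP/1.0\r\nHost: docker\r\n\r\n")
    raw = b""
    while chunk := s.recv(4096):
        raw += chunk
    s.close()
    _headers, _, body = raw.partition(b"\r\n\r\n")
    return json.loads(body)

if __name__ == "__main__":
    v = docker_version()
    print(v.get("Version"), v.get("ApiVersion"))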
Mar 17 17:29:41.993397 containerd[1538]: time="2025-03-17T17:29:41.993343978Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:41.994333 containerd[1538]: time="2025-03-17T17:29:41.993849844Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=29793526" Mar 17 17:29:41.995041 containerd[1538]: time="2025-03-17T17:29:41.995005414Z" level=info msg="ImageCreate event name:\"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:41.998086 containerd[1538]: time="2025-03-17T17:29:41.998050040Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:41.999465 containerd[1538]: time="2025-03-17T17:29:41.999272595Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"29790324\" in 1.800923045s" Mar 17 17:29:41.999465 containerd[1538]: time="2025-03-17T17:29:41.999314345Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\"" Mar 17 17:29:42.017428 containerd[1538]: time="2025-03-17T17:29:42.017388256Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 17 17:29:43.067121 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 17:29:43.080065 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:43.175622 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:43.179432 (kubelet)[2043]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:29:43.228454 kubelet[2043]: E0317 17:29:43.228362 2043 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 17:29:43.231699 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 17:29:43.231875 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
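The pull summaries in this stretch of the log give both bytes read and wall-clock time, which is enough for a back-of-the-envelope throughput figure (the kube-apiserver image above: 29,793,526 bytes in about 1.80 s, roughly 16.5 MB/s). A short sketch of that arithmetic over the four kube-* pulls logged in this boot, with numbers copied from the messages (these are transfer sizes, not unpacked image sizes):

# Pull statistics copied from the containerd messages in this log:
# (bytes read during the pull, wall-clock pull duration in seconds).
pulls = {
    "kube-apiserver:v1.30.11":          (29_793_526, 1.800923045),
    "kube-controller-manager:v1.30.11": (26_861_169, 1.552146099),
    "kube-scheduler:v1.30.11":          (16_264_638, 1.049960965),
    "kube-proxy:v1.30.11":              (25_771_850, 1.317098964),
}

for image, (nbytes, secs) in pulls.items():
    rate = nbytes / secs / 1_000_000  # decimal MB/s
    print(f"{image:35s} {nbytes/1e6:6.1f} MB in {secs:5.2f} s  ~{rate:5.1f} MB/s")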
Mar 17 17:29:43.563579 containerd[1538]: time="2025-03-17T17:29:43.563462283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:43.564242 containerd[1538]: time="2025-03-17T17:29:43.564194956Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=26861169" Mar 17 17:29:43.564798 containerd[1538]: time="2025-03-17T17:29:43.564774575Z" level=info msg="ImageCreate event name:\"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:43.568605 containerd[1538]: time="2025-03-17T17:29:43.568566008Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:43.569604 containerd[1538]: time="2025-03-17T17:29:43.569573986Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"28301963\" in 1.552146099s" Mar 17 17:29:43.569786 containerd[1538]: time="2025-03-17T17:29:43.569689782Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\"" Mar 17 17:29:43.589184 containerd[1538]: time="2025-03-17T17:29:43.589145041Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 17 17:29:44.631090 containerd[1538]: time="2025-03-17T17:29:44.631044083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:44.633178 containerd[1538]: time="2025-03-17T17:29:44.633140738Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=16264638" Mar 17 17:29:44.634213 containerd[1538]: time="2025-03-17T17:29:44.634193410Z" level=info msg="ImageCreate event name:\"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:44.637021 containerd[1538]: time="2025-03-17T17:29:44.636994384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:44.639205 containerd[1538]: time="2025-03-17T17:29:44.639159888Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"17705450\" in 1.049960965s" Mar 17 17:29:44.639205 containerd[1538]: time="2025-03-17T17:29:44.639193097Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\"" Mar 17 17:29:44.656762 
containerd[1538]: time="2025-03-17T17:29:44.656696923Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 17 17:29:45.656100 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4203391011.mount: Deactivated successfully. Mar 17 17:29:45.969610 containerd[1538]: time="2025-03-17T17:29:45.969464610Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:45.970377 containerd[1538]: time="2025-03-17T17:29:45.970322137Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=25771850" Mar 17 17:29:45.971381 containerd[1538]: time="2025-03-17T17:29:45.971338757Z" level=info msg="ImageCreate event name:\"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:45.973173 containerd[1538]: time="2025-03-17T17:29:45.973138124Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:45.973923 containerd[1538]: time="2025-03-17T17:29:45.973833406Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"25770867\" in 1.317098964s" Mar 17 17:29:45.973923 containerd[1538]: time="2025-03-17T17:29:45.973877397Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\"" Mar 17 17:29:46.000302 containerd[1538]: time="2025-03-17T17:29:46.000265817Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 17 17:29:46.612160 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3031621541.mount: Deactivated successfully. 
Mar 17 17:29:47.177476 containerd[1538]: time="2025-03-17T17:29:47.177427643Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:47.178463 containerd[1538]: time="2025-03-17T17:29:47.178218310Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485383" Mar 17 17:29:47.181619 containerd[1538]: time="2025-03-17T17:29:47.181549560Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:47.184809 containerd[1538]: time="2025-03-17T17:29:47.184746033Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:47.186079 containerd[1538]: time="2025-03-17T17:29:47.186043883Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.185734745s" Mar 17 17:29:47.186276 containerd[1538]: time="2025-03-17T17:29:47.186161658Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Mar 17 17:29:47.205417 containerd[1538]: time="2025-03-17T17:29:47.205371942Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 17 17:29:47.729818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount179684112.mount: Deactivated successfully. 
Mar 17 17:29:47.734886 containerd[1538]: time="2025-03-17T17:29:47.734725495Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:47.738034 containerd[1538]: time="2025-03-17T17:29:47.737983150Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268823" Mar 17 17:29:47.739774 containerd[1538]: time="2025-03-17T17:29:47.739732547Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:47.742586 containerd[1538]: time="2025-03-17T17:29:47.742551985Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:47.744116 containerd[1538]: time="2025-03-17T17:29:47.744081635Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 538.661482ms" Mar 17 17:29:47.744170 containerd[1538]: time="2025-03-17T17:29:47.744115240Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\"" Mar 17 17:29:47.762942 containerd[1538]: time="2025-03-17T17:29:47.762903829Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 17 17:29:48.228563 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount454313115.mount: Deactivated successfully. Mar 17 17:29:49.952103 containerd[1538]: time="2025-03-17T17:29:49.952047081Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:49.953289 containerd[1538]: time="2025-03-17T17:29:49.953207170Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191474" Mar 17 17:29:49.953965 containerd[1538]: time="2025-03-17T17:29:49.953923381Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:49.957548 containerd[1538]: time="2025-03-17T17:29:49.957493974Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:29:49.958875 containerd[1538]: time="2025-03-17T17:29:49.958796020Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 2.19585092s" Mar 17 17:29:49.958875 containerd[1538]: time="2025-03-17T17:29:49.958828924Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\"" Mar 17 17:29:53.482184 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 17 17:29:53.495040 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:53.706447 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:53.711907 (kubelet)[2278]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:29:53.752943 kubelet[2278]: E0317 17:29:53.752768 2278 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 17:29:53.755574 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 17:29:53.755790 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 17:29:56.310248 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:56.327075 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:56.344020 systemd[1]: Reloading requested from client PID 2298 ('systemctl') (unit session-7.scope)... Mar 17 17:29:56.344035 systemd[1]: Reloading... Mar 17 17:29:56.408901 zram_generator::config[2337]: No configuration found. Mar 17 17:29:56.521129 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:29:56.569443 systemd[1]: Reloading finished in 225 ms. Mar 17 17:29:56.616153 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 17 17:29:56.616218 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 17 17:29:56.616477 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:56.618790 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:29:56.709955 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:29:56.715109 (kubelet)[2395]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:29:56.756915 kubelet[2395]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:29:56.756915 kubelet[2395]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 17:29:56.756915 kubelet[2395]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
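The three deprecation warnings above concern flags that kubelet v1.30 expects in the config file instead: --container-runtime-endpoint and --volume-plugin-dir map to the KubeletConfiguration fields containerRuntimeEndpoint and volumePluginDir, while --pod-infra-container-image has no config equivalent because image garbage collection now learns the sandbox image from the CRI, as the warning itself says. A sketch of the replacement fields; the concrete values are inferred from elsewhere in this log (the containerd socket and the flexvolume directory) and should be treated as assumptions:

import pathlib

# Config-file equivalents of the deprecated flags; values are assumptions
# inferred from this log, not read from the actual kubelet invocation.
SNIPPET = (
    "containerRuntimeEndpoint: unix:///run/containerd/containerd.sock\n"
    "volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/\n"
)

cfg = pathlib.Path("/var/lib/kubelet/config.yaml")
if cfg.exists() and "containerRuntimeEndpoint" not in cfg.read_text():
    # Naive append; acceptable here because both keys are flat, top-level fields.
    cfg.write_text(cfg.read_text() + SNIPPET)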
Mar 17 17:29:56.757307 kubelet[2395]: I0317 17:29:56.757111 2395 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:29:57.284712 kubelet[2395]: I0317 17:29:57.284653 2395 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 17:29:57.284712 kubelet[2395]: I0317 17:29:57.284682 2395 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:29:57.285173 kubelet[2395]: I0317 17:29:57.285143 2395 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 17:29:57.331348 kubelet[2395]: E0317 17:29:57.331308 2395 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.79:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.331484 kubelet[2395]: I0317 17:29:57.331441 2395 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:29:57.340494 kubelet[2395]: I0317 17:29:57.340468 2395 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 17:29:57.343380 kubelet[2395]: I0317 17:29:57.343323 2395 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:29:57.343604 kubelet[2395]: I0317 17:29:57.343379 2395 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 17:29:57.343713 kubelet[2395]: I0317 17:29:57.343677 2395 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 17:29:57.343713 kubelet[2395]: I0317 17:29:57.343691 2395 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 17:29:57.344172 kubelet[2395]: I0317 17:29:57.344151 2395 state_mem.go:36] "Initialized new in-memory state store" Mar 17 
17:29:57.349241 kubelet[2395]: I0317 17:29:57.349215 2395 kubelet.go:400] "Attempting to sync node with API server" Mar 17 17:29:57.349275 kubelet[2395]: I0317 17:29:57.349244 2395 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:29:57.349915 kubelet[2395]: I0317 17:29:57.349689 2395 kubelet.go:312] "Adding apiserver pod source" Mar 17 17:29:57.349915 kubelet[2395]: I0317 17:29:57.349705 2395 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:29:57.349915 kubelet[2395]: W0317 17:29:57.349792 2395 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.349915 kubelet[2395]: E0317 17:29:57.349848 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.350442 kubelet[2395]: W0317 17:29:57.350326 2395 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.350442 kubelet[2395]: E0317 17:29:57.350378 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.351040 kubelet[2395]: I0317 17:29:57.351021 2395 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:29:57.351416 kubelet[2395]: I0317 17:29:57.351401 2395 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:29:57.351500 kubelet[2395]: W0317 17:29:57.351445 2395 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
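The repeated "connect: connection refused" errors against https://10.0.0.79:6443 above are the usual chicken-and-egg phase of a kubeadm-style bootstrap: the kubelet is dialing the API server that it has not yet launched from the static pod manifests under /etc/kubernetes/manifests. A minimal TCP reachability probe for that endpoint (host and port copied from the log):

import socket

def apiserver_reachable(host="10.0.0.79", port=6443, timeout=2.0):
    """TCP-level check of the kube-apiserver endpoint the kubelet keeps dialing."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("apiserver listening" if apiserver_reachable() else "connection refused / timed out")

These errors should clear on their own once the kube-apiserver static pod created further down comes up and binds port 6443.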
Mar 17 17:29:57.352331 kubelet[2395]: I0317 17:29:57.352301 2395 server.go:1264] "Started kubelet" Mar 17 17:29:57.354477 kubelet[2395]: I0317 17:29:57.352753 2395 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:29:57.354477 kubelet[2395]: I0317 17:29:57.353068 2395 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:29:57.354477 kubelet[2395]: I0317 17:29:57.353113 2395 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:29:57.354477 kubelet[2395]: I0317 17:29:57.353384 2395 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:29:57.354477 kubelet[2395]: I0317 17:29:57.354215 2395 server.go:455] "Adding debug handlers to kubelet server" Mar 17 17:29:57.362032 kubelet[2395]: E0317 17:29:57.360964 2395 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.79:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.79:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182da7510949f65d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-17 17:29:57.352281693 +0000 UTC m=+0.634026125,LastTimestamp:2025-03-17 17:29:57.352281693 +0000 UTC m=+0.634026125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 17 17:29:57.362898 kubelet[2395]: I0317 17:29:57.362879 2395 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:29:57.364580 kubelet[2395]: I0317 17:29:57.364542 2395 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:29:57.365720 kubelet[2395]: I0317 17:29:57.365695 2395 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:29:57.365983 kubelet[2395]: E0317 17:29:57.365784 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.79:6443: connect: connection refused" interval="200ms" Mar 17 17:29:57.366221 kubelet[2395]: W0317 17:29:57.366133 2395 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.79:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.366221 kubelet[2395]: E0317 17:29:57.366191 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.79:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.368799 kubelet[2395]: I0317 17:29:57.368767 2395 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:29:57.368899 kubelet[2395]: I0317 17:29:57.368850 2395 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:29:57.369944 kubelet[2395]: E0317 17:29:57.369882 2395 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:29:57.371189 kubelet[2395]: I0317 17:29:57.371161 2395 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:29:57.374348 kubelet[2395]: I0317 17:29:57.374236 2395 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:29:57.375358 kubelet[2395]: I0317 17:29:57.375310 2395 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 17:29:57.375483 kubelet[2395]: I0317 17:29:57.375465 2395 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:29:57.375512 kubelet[2395]: I0317 17:29:57.375490 2395 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:29:57.375553 kubelet[2395]: E0317 17:29:57.375534 2395 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:29:57.381916 kubelet[2395]: W0317 17:29:57.381828 2395 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.79:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.381916 kubelet[2395]: E0317 17:29:57.381918 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.79:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:57.389401 kubelet[2395]: I0317 17:29:57.389307 2395 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:29:57.389401 kubelet[2395]: I0317 17:29:57.389327 2395 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:29:57.389401 kubelet[2395]: I0317 17:29:57.389347 2395 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:29:57.450536 kubelet[2395]: I0317 17:29:57.450505 2395 policy_none.go:49] "None policy: Start" Mar 17 17:29:57.451411 kubelet[2395]: I0317 17:29:57.451392 2395 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:29:57.451677 kubelet[2395]: I0317 17:29:57.451538 2395 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:29:57.458200 kubelet[2395]: I0317 17:29:57.457494 2395 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:29:57.458200 kubelet[2395]: I0317 17:29:57.457676 2395 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:29:57.458200 kubelet[2395]: I0317 17:29:57.457788 2395 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:29:57.459312 kubelet[2395]: E0317 17:29:57.459289 2395 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 17 17:29:57.464358 kubelet[2395]: I0317 17:29:57.464333 2395 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:29:57.464688 kubelet[2395]: E0317 17:29:57.464666 2395 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.79:6443/api/v1/nodes\": dial tcp 10.0.0.79:6443: connect: connection refused" node="localhost" Mar 17 17:29:57.476066 kubelet[2395]: I0317 17:29:57.476025 2395 topology_manager.go:215] "Topology Admit Handler" 
podUID="17509ba2d3127cb7c0d3ecfbaa441a4f" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 17:29:57.477007 kubelet[2395]: I0317 17:29:57.476976 2395 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 17:29:57.477685 kubelet[2395]: I0317 17:29:57.477659 2395 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 17:29:57.566488 kubelet[2395]: E0317 17:29:57.566356 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.79:6443: connect: connection refused" interval="400ms" Mar 17 17:29:57.567573 kubelet[2395]: I0317 17:29:57.567547 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/17509ba2d3127cb7c0d3ecfbaa441a4f-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"17509ba2d3127cb7c0d3ecfbaa441a4f\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:29:57.567573 kubelet[2395]: I0317 17:29:57.567574 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:29:57.567741 kubelet[2395]: I0317 17:29:57.567597 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:29:57.567741 kubelet[2395]: I0317 17:29:57.567616 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 17:29:57.567741 kubelet[2395]: I0317 17:29:57.567633 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/17509ba2d3127cb7c0d3ecfbaa441a4f-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"17509ba2d3127cb7c0d3ecfbaa441a4f\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:29:57.567741 kubelet[2395]: I0317 17:29:57.567647 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/17509ba2d3127cb7c0d3ecfbaa441a4f-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"17509ba2d3127cb7c0d3ecfbaa441a4f\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:29:57.567741 kubelet[2395]: I0317 17:29:57.567688 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod 
\"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:29:57.567979 kubelet[2395]: I0317 17:29:57.567723 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:29:57.567979 kubelet[2395]: I0317 17:29:57.567741 2395 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:29:57.666854 kubelet[2395]: I0317 17:29:57.666782 2395 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:29:57.667170 kubelet[2395]: E0317 17:29:57.667132 2395 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.79:6443/api/v1/nodes\": dial tcp 10.0.0.79:6443: connect: connection refused" node="localhost" Mar 17 17:29:57.786486 kubelet[2395]: E0317 17:29:57.786457 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:57.786956 kubelet[2395]: E0317 17:29:57.786891 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:57.787410 containerd[1538]: time="2025-03-17T17:29:57.787355461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,}" Mar 17 17:29:57.787641 containerd[1538]: time="2025-03-17T17:29:57.787370932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:17509ba2d3127cb7c0d3ecfbaa441a4f,Namespace:kube-system,Attempt:0,}" Mar 17 17:29:57.788904 kubelet[2395]: E0317 17:29:57.788854 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:57.789256 containerd[1538]: time="2025-03-17T17:29:57.789222841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,}" Mar 17 17:29:57.967842 kubelet[2395]: E0317 17:29:57.967699 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.79:6443: connect: connection refused" interval="800ms" Mar 17 17:29:58.069138 kubelet[2395]: I0317 17:29:58.069097 2395 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:29:58.069433 kubelet[2395]: E0317 17:29:58.069394 2395 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.79:6443/api/v1/nodes\": dial tcp 10.0.0.79:6443: connect: connection refused" node="localhost" Mar 17 17:29:58.182862 kubelet[2395]: W0317 17:29:58.182769 2395 reflector.go:547] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:58.182862 kubelet[2395]: E0317 17:29:58.182837 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.79:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:58.296007 kubelet[2395]: W0317 17:29:58.295819 2395 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:58.296007 kubelet[2395]: E0317 17:29:58.295917 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.79:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:58.339310 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1278240233.mount: Deactivated successfully. Mar 17 17:29:58.344089 containerd[1538]: time="2025-03-17T17:29:58.344033374Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:58.346427 containerd[1538]: time="2025-03-17T17:29:58.346361814Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:29:58.346981 containerd[1538]: time="2025-03-17T17:29:58.346952709Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:58.350604 containerd[1538]: time="2025-03-17T17:29:58.350562969Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:58.351205 containerd[1538]: time="2025-03-17T17:29:58.351165299Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269175" Mar 17 17:29:58.352124 containerd[1538]: time="2025-03-17T17:29:58.352100137Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:58.352556 containerd[1538]: time="2025-03-17T17:29:58.352521160Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 17 17:29:58.354477 containerd[1538]: time="2025-03-17T17:29:58.354447207Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 17 17:29:58.357253 containerd[1538]: time="2025-03-17T17:29:58.357217660Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 567.932015ms" Mar 17 17:29:58.358664 containerd[1538]: time="2025-03-17T17:29:58.358548374Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 571.086695ms" Mar 17 17:29:58.360831 containerd[1538]: time="2025-03-17T17:29:58.360766391Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 573.329817ms" Mar 17 17:29:58.486933 containerd[1538]: time="2025-03-17T17:29:58.486733802Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:29:58.486933 containerd[1538]: time="2025-03-17T17:29:58.486919746Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:29:58.487454 containerd[1538]: time="2025-03-17T17:29:58.486943974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:58.487454 containerd[1538]: time="2025-03-17T17:29:58.487140952Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:29:58.487454 containerd[1538]: time="2025-03-17T17:29:58.487198283Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:29:58.487454 containerd[1538]: time="2025-03-17T17:29:58.487213835Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:58.487454 containerd[1538]: time="2025-03-17T17:29:58.487304308Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:58.487454 containerd[1538]: time="2025-03-17T17:29:58.487328936Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:58.488627 containerd[1538]: time="2025-03-17T17:29:58.488545988Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:29:58.488627 containerd[1538]: time="2025-03-17T17:29:58.488592005Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:29:58.488627 containerd[1538]: time="2025-03-17T17:29:58.488603159Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:58.488738 containerd[1538]: time="2025-03-17T17:29:58.488665247Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:29:58.530154 containerd[1538]: time="2025-03-17T17:29:58.530112130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:17509ba2d3127cb7c0d3ecfbaa441a4f,Namespace:kube-system,Attempt:0,} returns sandbox id \"63123e664783d047d328b4d7ac6efcc63ce76885dc6f6affb69a1b73eb517890\"" Mar 17 17:29:58.531567 kubelet[2395]: E0317 17:29:58.531540 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:58.534888 containerd[1538]: time="2025-03-17T17:29:58.534838415Z" level=info msg="CreateContainer within sandbox \"63123e664783d047d328b4d7ac6efcc63ce76885dc6f6affb69a1b73eb517890\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 17:29:58.535172 containerd[1538]: time="2025-03-17T17:29:58.534881353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"02fac1b66ec2ed5688f8bd757fdf70dba33ca7432a682dc14b09addf024fb1ec\"" Mar 17 17:29:58.536775 kubelet[2395]: E0317 17:29:58.536680 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:58.537396 containerd[1538]: time="2025-03-17T17:29:58.537302105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,} returns sandbox id \"c3efa09a1c8d36302ffa404e95309f80dfd64921c3160e9b675ae8ae402d648c\"" Mar 17 17:29:58.537932 kubelet[2395]: E0317 17:29:58.537852 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:58.539580 containerd[1538]: time="2025-03-17T17:29:58.539549867Z" level=info msg="CreateContainer within sandbox \"02fac1b66ec2ed5688f8bd757fdf70dba33ca7432a682dc14b09addf024fb1ec\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 17:29:58.539946 containerd[1538]: time="2025-03-17T17:29:58.539900366Z" level=info msg="CreateContainer within sandbox \"c3efa09a1c8d36302ffa404e95309f80dfd64921c3160e9b675ae8ae402d648c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 17:29:58.560165 containerd[1538]: time="2025-03-17T17:29:58.560032513Z" level=info msg="CreateContainer within sandbox \"63123e664783d047d328b4d7ac6efcc63ce76885dc6f6affb69a1b73eb517890\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"712ae7a0ba15b13a37164fe68e7d89307e5630da1f1b6da0b725949307ef21a6\"" Mar 17 17:29:58.561719 containerd[1538]: time="2025-03-17T17:29:58.561684381Z" level=info msg="StartContainer for \"712ae7a0ba15b13a37164fe68e7d89307e5630da1f1b6da0b725949307ef21a6\"" Mar 17 17:29:58.562478 containerd[1538]: time="2025-03-17T17:29:58.562398893Z" level=info msg="CreateContainer within sandbox \"02fac1b66ec2ed5688f8bd757fdf70dba33ca7432a682dc14b09addf024fb1ec\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"cf63f9b9a5897b47c917b35326a1a5a5d83e4cb3fd27fc90dfa893cfa63d6e74\"" Mar 17 17:29:58.563122 containerd[1538]: time="2025-03-17T17:29:58.563091057Z" level=info msg="StartContainer for 
\"cf63f9b9a5897b47c917b35326a1a5a5d83e4cb3fd27fc90dfa893cfa63d6e74\"" Mar 17 17:29:58.565250 containerd[1538]: time="2025-03-17T17:29:58.565160790Z" level=info msg="CreateContainer within sandbox \"c3efa09a1c8d36302ffa404e95309f80dfd64921c3160e9b675ae8ae402d648c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c8fbf07edd3078719e736092082c890add6362dc9f6eb0b4950a8e65bfac8d2d\"" Mar 17 17:29:58.565759 containerd[1538]: time="2025-03-17T17:29:58.565736333Z" level=info msg="StartContainer for \"c8fbf07edd3078719e736092082c890add6362dc9f6eb0b4950a8e65bfac8d2d\"" Mar 17 17:29:58.613101 kubelet[2395]: W0317 17:29:58.613025 2395 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.79:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:58.613330 kubelet[2395]: E0317 17:29:58.613314 2395 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.79:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.79:6443: connect: connection refused Mar 17 17:29:58.621300 containerd[1538]: time="2025-03-17T17:29:58.621106882Z" level=info msg="StartContainer for \"712ae7a0ba15b13a37164fe68e7d89307e5630da1f1b6da0b725949307ef21a6\" returns successfully" Mar 17 17:29:58.632615 containerd[1538]: time="2025-03-17T17:29:58.632537472Z" level=info msg="StartContainer for \"cf63f9b9a5897b47c917b35326a1a5a5d83e4cb3fd27fc90dfa893cfa63d6e74\" returns successfully" Mar 17 17:29:58.643573 containerd[1538]: time="2025-03-17T17:29:58.643532686Z" level=info msg="StartContainer for \"c8fbf07edd3078719e736092082c890add6362dc9f6eb0b4950a8e65bfac8d2d\" returns successfully" Mar 17 17:29:58.772643 kubelet[2395]: E0317 17:29:58.772588 2395 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.79:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.79:6443: connect: connection refused" interval="1.6s" Mar 17 17:29:58.870847 kubelet[2395]: I0317 17:29:58.870748 2395 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:29:59.391951 kubelet[2395]: E0317 17:29:59.391878 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:59.394156 kubelet[2395]: E0317 17:29:59.394121 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:29:59.395509 kubelet[2395]: E0317 17:29:59.395475 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:00.399756 kubelet[2395]: E0317 17:30:00.399683 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:00.630146 kubelet[2395]: E0317 17:30:00.630083 2395 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 17 17:30:00.711893 kubelet[2395]: I0317 17:30:00.711762 2395 
kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 17:30:00.731324 kubelet[2395]: E0317 17:30:00.731286 2395 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:30:00.831848 kubelet[2395]: E0317 17:30:00.831785 2395 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:30:00.932405 kubelet[2395]: E0317 17:30:00.932355 2395 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:30:01.352859 kubelet[2395]: I0317 17:30:01.352819 2395 apiserver.go:52] "Watching apiserver" Mar 17 17:30:01.365385 kubelet[2395]: I0317 17:30:01.365356 2395 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:30:01.403185 kubelet[2395]: E0317 17:30:01.403148 2395 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 17 17:30:01.403580 kubelet[2395]: E0317 17:30:01.403561 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:01.872336 kubelet[2395]: E0317 17:30:01.872296 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:02.336824 kubelet[2395]: E0317 17:30:02.336722 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:02.399208 kubelet[2395]: E0317 17:30:02.399168 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:02.399746 kubelet[2395]: E0317 17:30:02.399713 2395 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:02.679095 systemd[1]: Reloading requested from client PID 2671 ('systemctl') (unit session-7.scope)... Mar 17 17:30:02.679412 systemd[1]: Reloading... Mar 17 17:30:02.734951 zram_generator::config[2710]: No configuration found. Mar 17 17:30:02.911008 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 17:30:02.965715 systemd[1]: Reloading finished in 285 ms. Mar 17 17:30:02.991517 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:30:03.002697 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 17:30:03.003102 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:30:03.012283 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:30:03.096860 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 17 17:30:03.100818 (kubelet)[2762]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 17 17:30:03.143597 kubelet[2762]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:30:03.143597 kubelet[2762]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 17:30:03.143597 kubelet[2762]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 17:30:03.143597 kubelet[2762]: I0317 17:30:03.142824 2762 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 17:30:03.147234 kubelet[2762]: I0317 17:30:03.147207 2762 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 17:30:03.147234 kubelet[2762]: I0317 17:30:03.147230 2762 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 17:30:03.147414 kubelet[2762]: I0317 17:30:03.147398 2762 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 17:30:03.148756 kubelet[2762]: I0317 17:30:03.148722 2762 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 17:30:03.149925 kubelet[2762]: I0317 17:30:03.149898 2762 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 17:30:03.155034 kubelet[2762]: I0317 17:30:03.155013 2762 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 17:30:03.155404 kubelet[2762]: I0317 17:30:03.155378 2762 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 17:30:03.155572 kubelet[2762]: I0317 17:30:03.155408 2762 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 17:30:03.155635 kubelet[2762]: I0317 17:30:03.155580 2762 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 17:30:03.155635 kubelet[2762]: I0317 17:30:03.155589 2762 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 17:30:03.155635 kubelet[2762]: I0317 17:30:03.155621 2762 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:30:03.155721 kubelet[2762]: I0317 17:30:03.155710 2762 kubelet.go:400] "Attempting to sync node with API server" Mar 17 17:30:03.155747 kubelet[2762]: I0317 17:30:03.155723 2762 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 17:30:03.155771 kubelet[2762]: I0317 17:30:03.155752 2762 kubelet.go:312] "Adding apiserver pod source" Mar 17 17:30:03.155771 kubelet[2762]: I0317 17:30:03.155767 2762 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 17:30:03.157130 kubelet[2762]: I0317 17:30:03.156979 2762 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Mar 17 17:30:03.160062 kubelet[2762]: I0317 17:30:03.160032 2762 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 17:30:03.161640 kubelet[2762]: I0317 17:30:03.160794 2762 server.go:1264] "Started kubelet" Mar 17 17:30:03.166492 kubelet[2762]: I0317 17:30:03.165373 2762 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 17:30:03.167281 kubelet[2762]: I0317 17:30:03.167260 2762 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 17:30:03.167368 kubelet[2762]: I0317 17:30:03.167343 2762 server.go:455] "Adding debug handlers to 
kubelet server" Mar 17 17:30:03.170063 kubelet[2762]: I0317 17:30:03.165309 2762 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 17:30:03.170063 kubelet[2762]: I0317 17:30:03.169796 2762 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 17:30:03.173892 kubelet[2762]: E0317 17:30:03.173579 2762 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 17:30:03.173892 kubelet[2762]: I0317 17:30:03.173616 2762 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 17:30:03.173892 kubelet[2762]: I0317 17:30:03.173692 2762 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 17:30:03.173892 kubelet[2762]: I0317 17:30:03.173818 2762 reconciler.go:26] "Reconciler: start to sync state" Mar 17 17:30:03.177437 kubelet[2762]: I0317 17:30:03.176732 2762 factory.go:221] Registration of the systemd container factory successfully Mar 17 17:30:03.177437 kubelet[2762]: I0317 17:30:03.177101 2762 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 17:30:03.179058 kubelet[2762]: E0317 17:30:03.179033 2762 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 17:30:03.179103 kubelet[2762]: I0317 17:30:03.179093 2762 factory.go:221] Registration of the containerd container factory successfully Mar 17 17:30:03.179938 kubelet[2762]: I0317 17:30:03.179776 2762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 17:30:03.184073 kubelet[2762]: I0317 17:30:03.184035 2762 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 17:30:03.184139 kubelet[2762]: I0317 17:30:03.184081 2762 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 17:30:03.184139 kubelet[2762]: I0317 17:30:03.184099 2762 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 17:30:03.184176 kubelet[2762]: E0317 17:30:03.184144 2762 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 17:30:03.220620 kubelet[2762]: I0317 17:30:03.220524 2762 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 17:30:03.220620 kubelet[2762]: I0317 17:30:03.220546 2762 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 17:30:03.220620 kubelet[2762]: I0317 17:30:03.220567 2762 state_mem.go:36] "Initialized new in-memory state store" Mar 17 17:30:03.220755 kubelet[2762]: I0317 17:30:03.220717 2762 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 17:30:03.220755 kubelet[2762]: I0317 17:30:03.220728 2762 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 17:30:03.220755 kubelet[2762]: I0317 17:30:03.220748 2762 policy_none.go:49] "None policy: Start" Mar 17 17:30:03.222126 kubelet[2762]: I0317 17:30:03.222020 2762 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 17:30:03.222126 kubelet[2762]: I0317 17:30:03.222050 2762 state_mem.go:35] "Initializing new in-memory state store" Mar 17 17:30:03.222223 kubelet[2762]: I0317 17:30:03.222195 2762 state_mem.go:75] "Updated machine memory state" Mar 17 17:30:03.223314 kubelet[2762]: I0317 17:30:03.223288 2762 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 17:30:03.223957 kubelet[2762]: I0317 17:30:03.223461 2762 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 17:30:03.223957 kubelet[2762]: I0317 17:30:03.223556 2762 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 17:30:03.277944 kubelet[2762]: I0317 17:30:03.277916 2762 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 17:30:03.284560 kubelet[2762]: I0317 17:30:03.284506 2762 topology_manager.go:215] "Topology Admit Handler" podUID="17509ba2d3127cb7c0d3ecfbaa441a4f" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 17:30:03.284668 kubelet[2762]: I0317 17:30:03.284621 2762 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 17:30:03.284691 kubelet[2762]: I0317 17:30:03.284663 2762 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Mar 17 17:30:03.284691 kubelet[2762]: I0317 17:30:03.284673 2762 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 17:30:03.284756 kubelet[2762]: I0317 17:30:03.284735 2762 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 17:30:03.290667 kubelet[2762]: E0317 17:30:03.290535 2762 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Mar 17 17:30:03.291229 kubelet[2762]: E0317 17:30:03.291196 2762 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" 
pod="kube-system/kube-controller-manager-localhost" Mar 17 17:30:03.374781 kubelet[2762]: I0317 17:30:03.374716 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:30:03.374781 kubelet[2762]: I0317 17:30:03.374766 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:30:03.374781 kubelet[2762]: I0317 17:30:03.374794 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/17509ba2d3127cb7c0d3ecfbaa441a4f-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"17509ba2d3127cb7c0d3ecfbaa441a4f\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:30:03.375000 kubelet[2762]: I0317 17:30:03.374810 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:30:03.375000 kubelet[2762]: I0317 17:30:03.374827 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:30:03.375000 kubelet[2762]: I0317 17:30:03.374841 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 17:30:03.375000 kubelet[2762]: I0317 17:30:03.374856 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/17509ba2d3127cb7c0d3ecfbaa441a4f-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"17509ba2d3127cb7c0d3ecfbaa441a4f\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:30:03.375000 kubelet[2762]: I0317 17:30:03.374893 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/17509ba2d3127cb7c0d3ecfbaa441a4f-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"17509ba2d3127cb7c0d3ecfbaa441a4f\") " pod="kube-system/kube-apiserver-localhost" Mar 17 17:30:03.375098 kubelet[2762]: I0317 17:30:03.374911 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: 
\"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 17:30:03.591609 kubelet[2762]: E0317 17:30:03.591493 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:03.592302 kubelet[2762]: E0317 17:30:03.591894 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:03.592302 kubelet[2762]: E0317 17:30:03.592230 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:04.156093 kubelet[2762]: I0317 17:30:04.156044 2762 apiserver.go:52] "Watching apiserver" Mar 17 17:30:04.174655 kubelet[2762]: I0317 17:30:04.174615 2762 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 17:30:04.198297 kubelet[2762]: E0317 17:30:04.197920 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:04.198297 kubelet[2762]: E0317 17:30:04.198021 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:04.204953 kubelet[2762]: E0317 17:30:04.204913 2762 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 17 17:30:04.205940 kubelet[2762]: E0317 17:30:04.205833 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:04.222974 kubelet[2762]: I0317 17:30:04.222898 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.222880881 podStartE2EDuration="2.222880881s" podCreationTimestamp="2025-03-17 17:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:04.21526247 +0000 UTC m=+1.111022359" watchObservedRunningTime="2025-03-17 17:30:04.222880881 +0000 UTC m=+1.118640810" Mar 17 17:30:04.239318 kubelet[2762]: I0317 17:30:04.239261 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.239245557 podStartE2EDuration="3.239245557s" podCreationTimestamp="2025-03-17 17:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:04.223243845 +0000 UTC m=+1.119003734" watchObservedRunningTime="2025-03-17 17:30:04.239245557 +0000 UTC m=+1.135005446" Mar 17 17:30:04.251197 kubelet[2762]: I0317 17:30:04.251144 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.25112562 podStartE2EDuration="1.25112562s" podCreationTimestamp="2025-03-17 17:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 
17:30:04.23945772 +0000 UTC m=+1.135217609" watchObservedRunningTime="2025-03-17 17:30:04.25112562 +0000 UTC m=+1.146885509" Mar 17 17:30:05.199261 kubelet[2762]: E0317 17:30:05.199229 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:06.200929 kubelet[2762]: E0317 17:30:06.200799 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:08.107638 sudo[1746]: pam_unix(sudo:session): session closed for user root Mar 17 17:30:08.109836 sshd[1745]: Connection closed by 10.0.0.1 port 34022 Mar 17 17:30:08.110432 sshd-session[1739]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:08.115521 systemd[1]: sshd@6-10.0.0.79:22-10.0.0.1:34022.service: Deactivated successfully. Mar 17 17:30:08.118732 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 17:30:08.120049 systemd-logind[1523]: Session 7 logged out. Waiting for processes to exit. Mar 17 17:30:08.121095 systemd-logind[1523]: Removed session 7. Mar 17 17:30:08.792396 kubelet[2762]: E0317 17:30:08.792328 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:09.204980 kubelet[2762]: E0317 17:30:09.204881 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:10.816818 kubelet[2762]: E0317 17:30:10.816768 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:10.878198 kubelet[2762]: E0317 17:30:10.878148 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:11.215378 kubelet[2762]: E0317 17:30:11.214478 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:11.215378 kubelet[2762]: E0317 17:30:11.215139 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:12.215841 kubelet[2762]: E0317 17:30:12.215799 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:17.121937 update_engine[1526]: I20250317 17:30:17.121642 1526 update_attempter.cc:509] Updating boot flags... 
Mar 17 17:30:17.156893 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2859) Mar 17 17:30:17.188182 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 40 scanned by (udev-worker) (2862) Mar 17 17:30:18.064395 kubelet[2762]: I0317 17:30:18.064358 2762 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 17:30:18.079161 containerd[1538]: time="2025-03-17T17:30:18.079099718Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 17:30:18.084888 kubelet[2762]: I0317 17:30:18.083104 2762 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 17:30:19.002895 kubelet[2762]: I0317 17:30:18.999879 2762 topology_manager.go:215] "Topology Admit Handler" podUID="e8fcad68-1cac-4e30-8598-02a6e80b3e53" podNamespace="kube-system" podName="kube-proxy-jc2xn" Mar 17 17:30:19.115483 kubelet[2762]: I0317 17:30:19.115443 2762 topology_manager.go:215] "Topology Admit Handler" podUID="177a2aad-3524-4c90-a3a4-2829045d830d" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-6fzkf" Mar 17 17:30:19.180812 kubelet[2762]: I0317 17:30:19.180765 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e8fcad68-1cac-4e30-8598-02a6e80b3e53-xtables-lock\") pod \"kube-proxy-jc2xn\" (UID: \"e8fcad68-1cac-4e30-8598-02a6e80b3e53\") " pod="kube-system/kube-proxy-jc2xn" Mar 17 17:30:19.180812 kubelet[2762]: I0317 17:30:19.180807 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8fcad68-1cac-4e30-8598-02a6e80b3e53-lib-modules\") pod \"kube-proxy-jc2xn\" (UID: \"e8fcad68-1cac-4e30-8598-02a6e80b3e53\") " pod="kube-system/kube-proxy-jc2xn" Mar 17 17:30:19.180812 kubelet[2762]: I0317 17:30:19.180826 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e8fcad68-1cac-4e30-8598-02a6e80b3e53-kube-proxy\") pod \"kube-proxy-jc2xn\" (UID: \"e8fcad68-1cac-4e30-8598-02a6e80b3e53\") " pod="kube-system/kube-proxy-jc2xn" Mar 17 17:30:19.181017 kubelet[2762]: I0317 17:30:19.180843 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n69c\" (UniqueName: \"kubernetes.io/projected/e8fcad68-1cac-4e30-8598-02a6e80b3e53-kube-api-access-5n69c\") pod \"kube-proxy-jc2xn\" (UID: \"e8fcad68-1cac-4e30-8598-02a6e80b3e53\") " pod="kube-system/kube-proxy-jc2xn" Mar 17 17:30:19.282169 kubelet[2762]: I0317 17:30:19.281811 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/177a2aad-3524-4c90-a3a4-2829045d830d-var-lib-calico\") pod \"tigera-operator-6479d6dc54-6fzkf\" (UID: \"177a2aad-3524-4c90-a3a4-2829045d830d\") " pod="tigera-operator/tigera-operator-6479d6dc54-6fzkf" Mar 17 17:30:19.282169 kubelet[2762]: I0317 17:30:19.281878 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qtk\" (UniqueName: \"kubernetes.io/projected/177a2aad-3524-4c90-a3a4-2829045d830d-kube-api-access-r5qtk\") pod \"tigera-operator-6479d6dc54-6fzkf\" (UID: \"177a2aad-3524-4c90-a3a4-2829045d830d\") " 
pod="tigera-operator/tigera-operator-6479d6dc54-6fzkf" Mar 17 17:30:19.303611 kubelet[2762]: E0317 17:30:19.303570 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:19.304482 containerd[1538]: time="2025-03-17T17:30:19.304434566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jc2xn,Uid:e8fcad68-1cac-4e30-8598-02a6e80b3e53,Namespace:kube-system,Attempt:0,}" Mar 17 17:30:19.330265 containerd[1538]: time="2025-03-17T17:30:19.330175305Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:19.330265 containerd[1538]: time="2025-03-17T17:30:19.330230905Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:19.330265 containerd[1538]: time="2025-03-17T17:30:19.330241825Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:19.330461 containerd[1538]: time="2025-03-17T17:30:19.330314585Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:19.357116 containerd[1538]: time="2025-03-17T17:30:19.357063810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jc2xn,Uid:e8fcad68-1cac-4e30-8598-02a6e80b3e53,Namespace:kube-system,Attempt:0,} returns sandbox id \"c5564f22ea39d593a168b4bd4c4f13fad24c8895c190a13fd73226f4c39e4050\"" Mar 17 17:30:19.359534 kubelet[2762]: E0317 17:30:19.359513 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:19.371316 containerd[1538]: time="2025-03-17T17:30:19.371041965Z" level=info msg="CreateContainer within sandbox \"c5564f22ea39d593a168b4bd4c4f13fad24c8895c190a13fd73226f4c39e4050\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 17:30:19.401301 containerd[1538]: time="2025-03-17T17:30:19.401223647Z" level=info msg="CreateContainer within sandbox \"c5564f22ea39d593a168b4bd4c4f13fad24c8895c190a13fd73226f4c39e4050\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"09b973967ebdb8cdc2ce6b3abcf66f0275f70a5da136e17ab0d44454615d503d\"" Mar 17 17:30:19.403669 containerd[1538]: time="2025-03-17T17:30:19.403642420Z" level=info msg="StartContainer for \"09b973967ebdb8cdc2ce6b3abcf66f0275f70a5da136e17ab0d44454615d503d\"" Mar 17 17:30:19.420517 containerd[1538]: time="2025-03-17T17:30:19.420196670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-6fzkf,Uid:177a2aad-3524-4c90-a3a4-2829045d830d,Namespace:tigera-operator,Attempt:0,}" Mar 17 17:30:19.439838 containerd[1538]: time="2025-03-17T17:30:19.439592414Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:19.439838 containerd[1538]: time="2025-03-17T17:30:19.439650294Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:19.439838 containerd[1538]: time="2025-03-17T17:30:19.439665094Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:19.439838 containerd[1538]: time="2025-03-17T17:30:19.439746335Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:19.465018 containerd[1538]: time="2025-03-17T17:30:19.464975951Z" level=info msg="StartContainer for \"09b973967ebdb8cdc2ce6b3abcf66f0275f70a5da136e17ab0d44454615d503d\" returns successfully" Mar 17 17:30:19.482254 containerd[1538]: time="2025-03-17T17:30:19.482148243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-6fzkf,Uid:177a2aad-3524-4c90-a3a4-2829045d830d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c2e06e8d9a9e5a386bc4965437a9bb8a2bd6c138bf99b81797df8c064dff7ca6\"" Mar 17 17:30:19.486259 containerd[1538]: time="2025-03-17T17:30:19.485963664Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 17 17:30:20.240378 kubelet[2762]: E0317 17:30:20.240351 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:20.301350 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2025049205.mount: Deactivated successfully. Mar 17 17:30:22.277838 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3828486957.mount: Deactivated successfully. Mar 17 17:30:22.795910 containerd[1538]: time="2025-03-17T17:30:22.795847740Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:22.796386 containerd[1538]: time="2025-03-17T17:30:22.796328342Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 17 17:30:22.797507 containerd[1538]: time="2025-03-17T17:30:22.797050705Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:22.799781 containerd[1538]: time="2025-03-17T17:30:22.799748158Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:22.800682 containerd[1538]: time="2025-03-17T17:30:22.800644642Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 3.314646898s" Mar 17 17:30:22.800720 containerd[1538]: time="2025-03-17T17:30:22.800681842Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 17 17:30:22.818675 containerd[1538]: time="2025-03-17T17:30:22.818565126Z" level=info msg="CreateContainer within sandbox \"c2e06e8d9a9e5a386bc4965437a9bb8a2bd6c138bf99b81797df8c064dff7ca6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 17:30:22.830384 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3567705954.mount: Deactivated successfully. 
Mar 17 17:30:22.831861 containerd[1538]: time="2025-03-17T17:30:22.831815708Z" level=info msg="CreateContainer within sandbox \"c2e06e8d9a9e5a386bc4965437a9bb8a2bd6c138bf99b81797df8c064dff7ca6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"4423c25b3a9bc7c7f19bf9f378839301855bbb7df9a868bc6f6d4b2b8643081c\"" Mar 17 17:30:22.833663 containerd[1538]: time="2025-03-17T17:30:22.833590436Z" level=info msg="StartContainer for \"4423c25b3a9bc7c7f19bf9f378839301855bbb7df9a868bc6f6d4b2b8643081c\"" Mar 17 17:30:22.880378 containerd[1538]: time="2025-03-17T17:30:22.880332334Z" level=info msg="StartContainer for \"4423c25b3a9bc7c7f19bf9f378839301855bbb7df9a868bc6f6d4b2b8643081c\" returns successfully" Mar 17 17:30:23.206421 kubelet[2762]: I0317 17:30:23.206166 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jc2xn" podStartSLOduration=5.206150094 podStartE2EDuration="5.206150094s" podCreationTimestamp="2025-03-17 17:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:20.250218517 +0000 UTC m=+17.145978406" watchObservedRunningTime="2025-03-17 17:30:23.206150094 +0000 UTC m=+20.101909983" Mar 17 17:30:23.258513 kubelet[2762]: I0317 17:30:23.258143 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-6fzkf" podStartSLOduration=0.928251035 podStartE2EDuration="4.258124846s" podCreationTimestamp="2025-03-17 17:30:19 +0000 UTC" firstStartedPulling="2025-03-17 17:30:19.483271049 +0000 UTC m=+16.379030898" lastFinishedPulling="2025-03-17 17:30:22.81314482 +0000 UTC m=+19.708904709" observedRunningTime="2025-03-17 17:30:23.257616884 +0000 UTC m=+20.153376773" watchObservedRunningTime="2025-03-17 17:30:23.258124846 +0000 UTC m=+20.153884735" Mar 17 17:30:27.925417 kubelet[2762]: I0317 17:30:27.925370 2762 topology_manager.go:215] "Topology Admit Handler" podUID="a48bedba-d487-4a31-a167-e0951d661000" podNamespace="calico-system" podName="calico-typha-5f5dcc6d5-x8njv" Mar 17 17:30:27.940011 kubelet[2762]: I0317 17:30:27.939913 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a48bedba-d487-4a31-a167-e0951d661000-typha-certs\") pod \"calico-typha-5f5dcc6d5-x8njv\" (UID: \"a48bedba-d487-4a31-a167-e0951d661000\") " pod="calico-system/calico-typha-5f5dcc6d5-x8njv" Mar 17 17:30:27.940702 kubelet[2762]: I0317 17:30:27.940646 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgs6n\" (UniqueName: \"kubernetes.io/projected/a48bedba-d487-4a31-a167-e0951d661000-kube-api-access-pgs6n\") pod \"calico-typha-5f5dcc6d5-x8njv\" (UID: \"a48bedba-d487-4a31-a167-e0951d661000\") " pod="calico-system/calico-typha-5f5dcc6d5-x8njv" Mar 17 17:30:27.941801 kubelet[2762]: I0317 17:30:27.941660 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a48bedba-d487-4a31-a167-e0951d661000-tigera-ca-bundle\") pod \"calico-typha-5f5dcc6d5-x8njv\" (UID: \"a48bedba-d487-4a31-a167-e0951d661000\") " pod="calico-system/calico-typha-5f5dcc6d5-x8njv" Mar 17 17:30:28.125183 kubelet[2762]: I0317 17:30:28.125119 2762 topology_manager.go:215] "Topology Admit Handler" podUID="aaa58314-dca5-4aee-82d7-a48c9a1f3413" 
podNamespace="calico-system" podName="calico-node-ncp6c" Mar 17 17:30:28.142150 kubelet[2762]: I0317 17:30:28.142063 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-cni-net-dir\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142528 kubelet[2762]: I0317 17:30:28.142317 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk88g\" (UniqueName: \"kubernetes.io/projected/aaa58314-dca5-4aee-82d7-a48c9a1f3413-kube-api-access-tk88g\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142528 kubelet[2762]: I0317 17:30:28.142349 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-flexvol-driver-host\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142528 kubelet[2762]: I0317 17:30:28.142381 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-xtables-lock\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142528 kubelet[2762]: I0317 17:30:28.142397 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-cni-log-dir\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142528 kubelet[2762]: I0317 17:30:28.142412 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-lib-modules\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142739 kubelet[2762]: I0317 17:30:28.142426 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aaa58314-dca5-4aee-82d7-a48c9a1f3413-tigera-ca-bundle\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142739 kubelet[2762]: I0317 17:30:28.142452 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-policysync\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142739 kubelet[2762]: I0317 17:30:28.142471 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-var-run-calico\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 
17:30:28.142739 kubelet[2762]: I0317 17:30:28.142488 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-cni-bin-dir\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142739 kubelet[2762]: I0317 17:30:28.142506 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aaa58314-dca5-4aee-82d7-a48c9a1f3413-var-lib-calico\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.142910 kubelet[2762]: I0317 17:30:28.142702 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aaa58314-dca5-4aee-82d7-a48c9a1f3413-node-certs\") pod \"calico-node-ncp6c\" (UID: \"aaa58314-dca5-4aee-82d7-a48c9a1f3413\") " pod="calico-system/calico-node-ncp6c" Mar 17 17:30:28.233151 kubelet[2762]: E0317 17:30:28.233021 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:28.233591 containerd[1538]: time="2025-03-17T17:30:28.233541217Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5f5dcc6d5-x8njv,Uid:a48bedba-d487-4a31-a167-e0951d661000,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:28.253822 kubelet[2762]: E0317 17:30:28.253732 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.254078 kubelet[2762]: W0317 17:30:28.253984 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.254078 kubelet[2762]: E0317 17:30:28.254013 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.260329 kubelet[2762]: E0317 17:30:28.259539 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.260329 kubelet[2762]: W0317 17:30:28.259557 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.260329 kubelet[2762]: E0317 17:30:28.259573 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.267833 containerd[1538]: time="2025-03-17T17:30:28.267494419Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:28.267833 containerd[1538]: time="2025-03-17T17:30:28.267559340Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:28.267833 containerd[1538]: time="2025-03-17T17:30:28.267573820Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:28.268937 containerd[1538]: time="2025-03-17T17:30:28.268807984Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:28.319803 containerd[1538]: time="2025-03-17T17:30:28.319745087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5f5dcc6d5-x8njv,Uid:a48bedba-d487-4a31-a167-e0951d661000,Namespace:calico-system,Attempt:0,} returns sandbox id \"1fe858f9745e939c0416706c666739fa3eb1fb5dcb452d0ce43345f093873f48\"" Mar 17 17:30:28.320718 kubelet[2762]: E0317 17:30:28.320467 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:28.323300 containerd[1538]: time="2025-03-17T17:30:28.323262340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 17 17:30:28.433748 kubelet[2762]: E0317 17:30:28.433701 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:28.434621 containerd[1538]: time="2025-03-17T17:30:28.434456740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ncp6c,Uid:aaa58314-dca5-4aee-82d7-a48c9a1f3413,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:28.457896 kubelet[2762]: I0317 17:30:28.453214 2762 topology_manager.go:215] "Topology Admit Handler" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" podNamespace="calico-system" podName="csi-node-driver-ddmk9" Mar 17 17:30:28.457896 kubelet[2762]: E0317 17:30:28.453507 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:28.502203 containerd[1538]: time="2025-03-17T17:30:28.501933383Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:28.502203 containerd[1538]: time="2025-03-17T17:30:28.502011264Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:28.502203 containerd[1538]: time="2025-03-17T17:30:28.502027864Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:28.502780 containerd[1538]: time="2025-03-17T17:30:28.502138504Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:28.544233 kubelet[2762]: E0317 17:30:28.544175 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.544233 kubelet[2762]: W0317 17:30:28.544223 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.544421 kubelet[2762]: E0317 17:30:28.544246 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.544952 kubelet[2762]: E0317 17:30:28.544935 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.544952 kubelet[2762]: W0317 17:30:28.544951 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.545009 kubelet[2762]: E0317 17:30:28.544963 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.545155 kubelet[2762]: E0317 17:30:28.545140 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.545155 kubelet[2762]: W0317 17:30:28.545153 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.545241 kubelet[2762]: E0317 17:30:28.545162 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.545376 kubelet[2762]: E0317 17:30:28.545364 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.545376 kubelet[2762]: W0317 17:30:28.545376 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.545444 kubelet[2762]: E0317 17:30:28.545385 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.545628 kubelet[2762]: E0317 17:30:28.545612 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.545628 kubelet[2762]: W0317 17:30:28.545626 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.545759 kubelet[2762]: E0317 17:30:28.545641 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.545841 kubelet[2762]: E0317 17:30:28.545824 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.545841 kubelet[2762]: W0317 17:30:28.545840 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.545930 kubelet[2762]: E0317 17:30:28.545851 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.546061 kubelet[2762]: E0317 17:30:28.546046 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.546061 kubelet[2762]: W0317 17:30:28.546059 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.546634 kubelet[2762]: E0317 17:30:28.546068 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.546634 kubelet[2762]: E0317 17:30:28.546231 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.546634 kubelet[2762]: W0317 17:30:28.546240 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.546634 kubelet[2762]: E0317 17:30:28.546248 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.546634 kubelet[2762]: E0317 17:30:28.546442 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.546634 kubelet[2762]: W0317 17:30:28.546452 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.546634 kubelet[2762]: E0317 17:30:28.546460 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.546790 kubelet[2762]: E0317 17:30:28.546666 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.546790 kubelet[2762]: W0317 17:30:28.546675 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.546790 kubelet[2762]: E0317 17:30:28.546683 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.546962 kubelet[2762]: E0317 17:30:28.546946 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.547025 kubelet[2762]: W0317 17:30:28.546961 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.547025 kubelet[2762]: E0317 17:30:28.546971 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.549430 kubelet[2762]: E0317 17:30:28.549396 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.549430 kubelet[2762]: W0317 17:30:28.549417 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.549430 kubelet[2762]: E0317 17:30:28.549430 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.550548 kubelet[2762]: E0317 17:30:28.549674 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.550548 kubelet[2762]: W0317 17:30:28.549691 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.550548 kubelet[2762]: E0317 17:30:28.549702 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.550548 kubelet[2762]: E0317 17:30:28.550207 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.550548 kubelet[2762]: W0317 17:30:28.550220 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.550548 kubelet[2762]: E0317 17:30:28.550230 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.550548 kubelet[2762]: E0317 17:30:28.550424 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.550548 kubelet[2762]: W0317 17:30:28.550434 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.550548 kubelet[2762]: E0317 17:30:28.550443 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.550789 kubelet[2762]: E0317 17:30:28.550609 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.550789 kubelet[2762]: W0317 17:30:28.550619 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.550789 kubelet[2762]: E0317 17:30:28.550630 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.550983 kubelet[2762]: E0317 17:30:28.550965 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.550983 kubelet[2762]: W0317 17:30:28.550981 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.551040 kubelet[2762]: E0317 17:30:28.550994 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.551144 kubelet[2762]: E0317 17:30:28.551131 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.551183 kubelet[2762]: W0317 17:30:28.551144 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.551183 kubelet[2762]: E0317 17:30:28.551154 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.551327 kubelet[2762]: E0317 17:30:28.551313 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.551353 kubelet[2762]: W0317 17:30:28.551326 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.551353 kubelet[2762]: E0317 17:30:28.551335 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.551513 kubelet[2762]: E0317 17:30:28.551500 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.551534 kubelet[2762]: W0317 17:30:28.551514 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.551534 kubelet[2762]: E0317 17:30:28.551525 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.551792 kubelet[2762]: E0317 17:30:28.551776 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.551792 kubelet[2762]: W0317 17:30:28.551790 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.551862 kubelet[2762]: E0317 17:30:28.551801 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.551862 kubelet[2762]: I0317 17:30:28.551827 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/191fd0a9-26f3-46f0-864d-1b5b729dbb52-varrun\") pod \"csi-node-driver-ddmk9\" (UID: \"191fd0a9-26f3-46f0-864d-1b5b729dbb52\") " pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:28.552083 kubelet[2762]: E0317 17:30:28.552066 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.552128 kubelet[2762]: W0317 17:30:28.552082 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.552128 kubelet[2762]: E0317 17:30:28.552098 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.552128 kubelet[2762]: I0317 17:30:28.552113 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/191fd0a9-26f3-46f0-864d-1b5b729dbb52-kubelet-dir\") pod \"csi-node-driver-ddmk9\" (UID: \"191fd0a9-26f3-46f0-864d-1b5b729dbb52\") " pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:28.552545 kubelet[2762]: E0317 17:30:28.552528 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.552579 kubelet[2762]: W0317 17:30:28.552545 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.552579 kubelet[2762]: E0317 17:30:28.552568 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.552617 kubelet[2762]: I0317 17:30:28.552583 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/191fd0a9-26f3-46f0-864d-1b5b729dbb52-registration-dir\") pod \"csi-node-driver-ddmk9\" (UID: \"191fd0a9-26f3-46f0-864d-1b5b729dbb52\") " pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:28.552790 kubelet[2762]: E0317 17:30:28.552777 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.552790 kubelet[2762]: W0317 17:30:28.552790 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.552845 kubelet[2762]: E0317 17:30:28.552803 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.552845 kubelet[2762]: I0317 17:30:28.552823 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv2m4\" (UniqueName: \"kubernetes.io/projected/191fd0a9-26f3-46f0-864d-1b5b729dbb52-kube-api-access-vv2m4\") pod \"csi-node-driver-ddmk9\" (UID: \"191fd0a9-26f3-46f0-864d-1b5b729dbb52\") " pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:28.553177 kubelet[2762]: E0317 17:30:28.553154 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.553177 kubelet[2762]: W0317 17:30:28.553176 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.553247 kubelet[2762]: E0317 17:30:28.553192 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.553247 kubelet[2762]: I0317 17:30:28.553209 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/191fd0a9-26f3-46f0-864d-1b5b729dbb52-socket-dir\") pod \"csi-node-driver-ddmk9\" (UID: \"191fd0a9-26f3-46f0-864d-1b5b729dbb52\") " pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:28.553558 kubelet[2762]: E0317 17:30:28.553535 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.553596 kubelet[2762]: W0317 17:30:28.553560 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.553596 kubelet[2762]: E0317 17:30:28.553580 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.553839 kubelet[2762]: E0317 17:30:28.553808 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.553839 kubelet[2762]: W0317 17:30:28.553821 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.553931 kubelet[2762]: E0317 17:30:28.553848 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.555107 kubelet[2762]: E0317 17:30:28.555093 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.555107 kubelet[2762]: W0317 17:30:28.555105 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.555216 kubelet[2762]: E0317 17:30:28.555200 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.555321 kubelet[2762]: E0317 17:30:28.555308 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.555321 kubelet[2762]: W0317 17:30:28.555318 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.555421 kubelet[2762]: E0317 17:30:28.555395 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.555488 kubelet[2762]: E0317 17:30:28.555474 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.555488 kubelet[2762]: W0317 17:30:28.555485 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.555571 kubelet[2762]: E0317 17:30:28.555512 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.555698 kubelet[2762]: E0317 17:30:28.555673 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.555698 kubelet[2762]: W0317 17:30:28.555686 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.555767 kubelet[2762]: E0317 17:30:28.555754 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.555849 kubelet[2762]: E0317 17:30:28.555839 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.555884 kubelet[2762]: W0317 17:30:28.555848 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.555884 kubelet[2762]: E0317 17:30:28.555856 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.556055 kubelet[2762]: E0317 17:30:28.556040 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.556055 kubelet[2762]: W0317 17:30:28.556052 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.556154 kubelet[2762]: E0317 17:30:28.556061 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.556253 kubelet[2762]: E0317 17:30:28.556242 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.556283 kubelet[2762]: W0317 17:30:28.556253 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.556283 kubelet[2762]: E0317 17:30:28.556261 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.556517 kubelet[2762]: E0317 17:30:28.556501 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.556517 kubelet[2762]: W0317 17:30:28.556513 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.556579 kubelet[2762]: E0317 17:30:28.556523 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.558945 containerd[1538]: time="2025-03-17T17:30:28.558907509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ncp6c,Uid:aaa58314-dca5-4aee-82d7-a48c9a1f3413,Namespace:calico-system,Attempt:0,} returns sandbox id \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\"" Mar 17 17:30:28.559803 kubelet[2762]: E0317 17:30:28.559784 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:28.653688 kubelet[2762]: E0317 17:30:28.653659 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.653688 kubelet[2762]: W0317 17:30:28.653680 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.653688 kubelet[2762]: E0317 17:30:28.653698 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.653952 kubelet[2762]: E0317 17:30:28.653931 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.653952 kubelet[2762]: W0317 17:30:28.653946 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.653952 kubelet[2762]: E0317 17:30:28.653960 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.654244 kubelet[2762]: E0317 17:30:28.654228 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.654325 kubelet[2762]: W0317 17:30:28.654312 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.654464 kubelet[2762]: E0317 17:30:28.654380 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.654562 kubelet[2762]: E0317 17:30:28.654550 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.654694 kubelet[2762]: W0317 17:30:28.654678 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.654770 kubelet[2762]: E0317 17:30:28.654758 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.655002 kubelet[2762]: E0317 17:30:28.654978 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.655002 kubelet[2762]: W0317 17:30:28.654999 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.655077 kubelet[2762]: E0317 17:30:28.655023 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.655359 kubelet[2762]: E0317 17:30:28.655309 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.655359 kubelet[2762]: W0317 17:30:28.655326 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.655430 kubelet[2762]: E0317 17:30:28.655357 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.655541 kubelet[2762]: E0317 17:30:28.655527 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.655541 kubelet[2762]: W0317 17:30:28.655539 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.655598 kubelet[2762]: E0317 17:30:28.655580 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.655740 kubelet[2762]: E0317 17:30:28.655728 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.655740 kubelet[2762]: W0317 17:30:28.655738 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.655804 kubelet[2762]: E0317 17:30:28.655751 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.656008 kubelet[2762]: E0317 17:30:28.655996 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.656008 kubelet[2762]: W0317 17:30:28.656008 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.656068 kubelet[2762]: E0317 17:30:28.656021 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.656219 kubelet[2762]: E0317 17:30:28.656204 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.656219 kubelet[2762]: W0317 17:30:28.656217 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.656278 kubelet[2762]: E0317 17:30:28.656230 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.656386 kubelet[2762]: E0317 17:30:28.656375 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.656386 kubelet[2762]: W0317 17:30:28.656386 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.656444 kubelet[2762]: E0317 17:30:28.656399 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.656541 kubelet[2762]: E0317 17:30:28.656522 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.656541 kubelet[2762]: W0317 17:30:28.656533 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.656699 kubelet[2762]: E0317 17:30:28.656591 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.656699 kubelet[2762]: E0317 17:30:28.656655 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.656699 kubelet[2762]: W0317 17:30:28.656664 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.656775 kubelet[2762]: E0317 17:30:28.656733 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.656833 kubelet[2762]: E0317 17:30:28.656819 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.656833 kubelet[2762]: W0317 17:30:28.656830 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.656918 kubelet[2762]: E0317 17:30:28.656877 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.657015 kubelet[2762]: E0317 17:30:28.657003 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.657015 kubelet[2762]: W0317 17:30:28.657013 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.657083 kubelet[2762]: E0317 17:30:28.657066 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.657206 kubelet[2762]: E0317 17:30:28.657195 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.657206 kubelet[2762]: W0317 17:30:28.657206 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.657264 kubelet[2762]: E0317 17:30:28.657224 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.657613 kubelet[2762]: E0317 17:30:28.657566 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.657649 kubelet[2762]: W0317 17:30:28.657614 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.657649 kubelet[2762]: E0317 17:30:28.657632 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.657855 kubelet[2762]: E0317 17:30:28.657845 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.657934 kubelet[2762]: W0317 17:30:28.657856 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.657934 kubelet[2762]: E0317 17:30:28.657879 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.658034 kubelet[2762]: E0317 17:30:28.658024 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.658068 kubelet[2762]: W0317 17:30:28.658034 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.658113 kubelet[2762]: E0317 17:30:28.658099 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.658198 kubelet[2762]: E0317 17:30:28.658189 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.658198 kubelet[2762]: W0317 17:30:28.658197 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.658303 kubelet[2762]: E0317 17:30:28.658273 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.658348 kubelet[2762]: E0317 17:30:28.658319 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.658348 kubelet[2762]: W0317 17:30:28.658326 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.658449 kubelet[2762]: E0317 17:30:28.658387 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.658488 kubelet[2762]: E0317 17:30:28.658476 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.658488 kubelet[2762]: W0317 17:30:28.658486 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.658546 kubelet[2762]: E0317 17:30:28.658499 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:28.658673 kubelet[2762]: E0317 17:30:28.658661 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.658673 kubelet[2762]: W0317 17:30:28.658672 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.658728 kubelet[2762]: E0317 17:30:28.658685 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.659035 kubelet[2762]: E0317 17:30:28.658982 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.659035 kubelet[2762]: W0317 17:30:28.658997 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.659035 kubelet[2762]: E0317 17:30:28.659010 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.659288 kubelet[2762]: E0317 17:30:28.659269 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.659288 kubelet[2762]: W0317 17:30:28.659283 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.659371 kubelet[2762]: E0317 17:30:28.659295 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:28.667984 kubelet[2762]: E0317 17:30:28.667951 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:28.667984 kubelet[2762]: W0317 17:30:28.667972 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:28.667984 kubelet[2762]: E0317 17:30:28.667989 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
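Editor's note on the repeated FlexVolume failures above: the kubelet probes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument, the executable is not present on the node, and the empty output cannot be unmarshalled as JSON, so driver-call.go and plugins.go log the same three messages on every probe. For context, a driver at that path is expected to answer init with a JSON status object on stdout. The following is a minimal, hypothetical stub, assuming the conventional FlexVolume status fields (status, message, capabilities); it is a sketch of the call contract, not the Calico nodeagent driver itself.

```go
// Hypothetical FlexVolume driver stub, assuming installation at
// /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds.
// The kubelet runs it with "init" at probe time and parses stdout as JSON;
// the empty output seen in the log is why unmarshalling fails here.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func reply(s driverStatus) {
	out, _ := json.Marshal(s)
	fmt.Println(string(out))
}

func main() {
	if len(os.Args) < 2 {
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// attach=false tells the kubelet not to route attach/detach calls here.
		reply(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
	default:
		// Calls this stub does not implement are reported as not supported.
		reply(driverStatus{Status: "Not supported", Message: "call not implemented: " + os.Args[1]})
	}
}
```

Until a binary exists at that path (or the plugin directory is removed), the kubelet keeps re-probing, which is why the same block repeats throughout this section with only the timestamps changing.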
Error: unexpected end of JSON input" Mar 17 17:30:30.048585 containerd[1538]: time="2025-03-17T17:30:30.048534117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:30.049600 containerd[1538]: time="2025-03-17T17:30:30.049545201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 17 17:30:30.050596 containerd[1538]: time="2025-03-17T17:30:30.050554844Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:30.052881 containerd[1538]: time="2025-03-17T17:30:30.052458050Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:30.053455 containerd[1538]: time="2025-03-17T17:30:30.053345453Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 1.730041753s" Mar 17 17:30:30.053455 containerd[1538]: time="2025-03-17T17:30:30.053376493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 17 17:30:30.055584 containerd[1538]: time="2025-03-17T17:30:30.054841138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 17:30:30.067418 containerd[1538]: time="2025-03-17T17:30:30.067375500Z" level=info msg="CreateContainer within sandbox \"1fe858f9745e939c0416706c666739fa3eb1fb5dcb452d0ce43345f093873f48\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 17:30:30.095029 containerd[1538]: time="2025-03-17T17:30:30.094915592Z" level=info msg="CreateContainer within sandbox \"1fe858f9745e939c0416706c666739fa3eb1fb5dcb452d0ce43345f093873f48\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d56785b9a3560bfe13d75517b54aee70ecfddbb23e40b61815011a3ff0803c4e\"" Mar 17 17:30:30.096238 containerd[1538]: time="2025-03-17T17:30:30.095688594Z" level=info msg="StartContainer for \"d56785b9a3560bfe13d75517b54aee70ecfddbb23e40b61815011a3ff0803c4e\"" Mar 17 17:30:30.171127 containerd[1538]: time="2025-03-17T17:30:30.171074365Z" level=info msg="StartContainer for \"d56785b9a3560bfe13d75517b54aee70ecfddbb23e40b61815011a3ff0803c4e\" returns successfully" Mar 17 17:30:30.184879 kubelet[2762]: E0317 17:30:30.184409 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:30.265905 kubelet[2762]: E0317 17:30:30.265830 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:30.283162 kubelet[2762]: I0317 17:30:30.283079 2762 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="calico-system/calico-typha-5f5dcc6d5-x8njv" podStartSLOduration=1.5515210590000001 podStartE2EDuration="3.283057777s" podCreationTimestamp="2025-03-17 17:30:27 +0000 UTC" firstStartedPulling="2025-03-17 17:30:28.322962099 +0000 UTC m=+25.218721988" lastFinishedPulling="2025-03-17 17:30:30.054498817 +0000 UTC m=+26.950258706" observedRunningTime="2025-03-17 17:30:30.278113361 +0000 UTC m=+27.173873250" watchObservedRunningTime="2025-03-17 17:30:30.283057777 +0000 UTC m=+27.178817666" Mar 17 17:30:30.364315 kubelet[2762]: E0317 17:30:30.364189 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.364315 kubelet[2762]: W0317 17:30:30.364216 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.364315 kubelet[2762]: E0317 17:30:30.364237 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.364616 kubelet[2762]: E0317 17:30:30.364601 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.364758 kubelet[2762]: W0317 17:30:30.364657 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.364758 kubelet[2762]: E0317 17:30:30.364674 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.364934 kubelet[2762]: E0317 17:30:30.364920 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.365089 kubelet[2762]: W0317 17:30:30.364989 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.365089 kubelet[2762]: E0317 17:30:30.365006 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.365231 kubelet[2762]: E0317 17:30:30.365217 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.365291 kubelet[2762]: W0317 17:30:30.365281 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.365349 kubelet[2762]: E0317 17:30:30.365338 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:30.365656 kubelet[2762]: E0317 17:30:30.365568 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.365656 kubelet[2762]: W0317 17:30:30.365580 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.365656 kubelet[2762]: E0317 17:30:30.365590 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.365815 kubelet[2762]: E0317 17:30:30.365803 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.365893 kubelet[2762]: W0317 17:30:30.365858 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.365955 kubelet[2762]: E0317 17:30:30.365945 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.366192 kubelet[2762]: E0317 17:30:30.366177 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.366362 kubelet[2762]: W0317 17:30:30.366266 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.366362 kubelet[2762]: E0317 17:30:30.366284 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.367197 kubelet[2762]: E0317 17:30:30.367048 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.367197 kubelet[2762]: W0317 17:30:30.367070 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.367197 kubelet[2762]: E0317 17:30:30.367083 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:30:30.369098 kubelet[2762]: E0317 17:30:30.368974 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.369098 kubelet[2762]: W0317 17:30:30.368991 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.369098 kubelet[2762]: E0317 17:30:30.369004 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:30.369395 kubelet[2762]: E0317 17:30:30.369293 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:30.369395 kubelet[2762]: W0317 17:30:30.369306 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:30.369395 kubelet[2762]: E0317 17:30:30.369317 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[The same three kubelet messages (driver-call.go:262, driver-call.go:149, plugins.go:730) repeat with only their timestamps advancing, through Mar 17 17:30:30.377021.]
Mar 17 17:30:31.274758 kubelet[2762]: I0317 17:30:31.274711 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:31.275724 kubelet[2762]: E0317 17:30:31.275387 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
[The FlexVolume probe triplet then repeats again, timestamps advancing from Mar 17 17:30:31.276994 through 17:30:31.291051; the final occurrence follows.]
Mar 17 17:30:31.302065 kubelet[2762]: E0317 17:30:31.301072 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:30:31.302065 kubelet[2762]: W0317 17:30:31.301100 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:30:31.302065 kubelet[2762]: E0317 17:30:31.301121 2762 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:30:31.614810 containerd[1538]: time="2025-03-17T17:30:31.614758170Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:31.615461 containerd[1538]: time="2025-03-17T17:30:31.615402132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 17 17:30:31.616318 containerd[1538]: time="2025-03-17T17:30:31.616287495Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:31.619014 containerd[1538]: time="2025-03-17T17:30:31.618973064Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:31.619548 containerd[1538]: time="2025-03-17T17:30:31.619508985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.563831724s" Mar 17 17:30:31.619548 containerd[1538]: time="2025-03-17T17:30:31.619544266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 17 17:30:31.623108 containerd[1538]: time="2025-03-17T17:30:31.623070997Z" level=info msg="CreateContainer within sandbox \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:30:31.643413 containerd[1538]: time="2025-03-17T17:30:31.643359182Z" level=info msg="CreateContainer within sandbox \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4f481fd0ebd8fe118cdd7b1491711cfaec35eeb21645d593ba0c56c9e484a750\"" Mar 17 17:30:31.645099 containerd[1538]: time="2025-03-17T17:30:31.644083144Z" level=info msg="StartContainer for \"4f481fd0ebd8fe118cdd7b1491711cfaec35eeb21645d593ba0c56c9e484a750\"" Mar 17 17:30:31.694946 containerd[1538]: time="2025-03-17T17:30:31.694904707Z" level=info msg="StartContainer for \"4f481fd0ebd8fe118cdd7b1491711cfaec35eeb21645d593ba0c56c9e484a750\" returns successfully" Mar 17 17:30:31.780223 systemd[1]: Started sshd@7-10.0.0.79:22-10.0.0.1:49616.service - OpenSSH per-connection server daemon (10.0.0.1:49616). 
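[Annotation] The burst of driver-call.go/plugins.go errors above comes from the kubelet probing the FlexVolume plugin directory nodeagent~uds before its uds binary exists; the pod2daemon-flexvol image pulled and the flexvol-driver container started in the entries just above appear to be what installs that binary. A minimal Go sketch (not kubelet source; the struct fields and helper names are illustrative) of why an absent driver surfaces as "unexpected end of JSON input":

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus is an illustrative stand-in for the JSON status object a
// FlexVolume driver is expected to print on stdout.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func probeDriver(path string) (*DriverStatus, error) {
	// The kubelet execs the driver with "init" and expects JSON on stdout.
	out, err := exec.Command(path, "init").Output()
	if err != nil {
		// A missing binary leaves out empty; the exec error is reported and
		// the (empty) output is still handed to the JSON decoder.
		fmt.Printf("driver call failed: %v, output: %q\n", err, string(out))
	}
	var st DriverStatus
	// json.Unmarshal on zero bytes returns "unexpected end of JSON input",
	// the exact error string repeated in the kubelet entries above.
	if uerr := json.Unmarshal(out, &st); uerr != nil {
		return nil, uerr
	}
	return &st, nil
}

func main() {
	_, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	if err != nil {
		fmt.Println("probe error:", err)
	}
}
```

Once a driver binary is in place and answers init with a JSON status, the same probe succeeds, which is consistent with this message not recurring later in the log.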
Mar 17 17:30:31.788246 containerd[1538]: time="2025-03-17T17:30:31.770560749Z" level=info msg="shim disconnected" id=4f481fd0ebd8fe118cdd7b1491711cfaec35eeb21645d593ba0c56c9e484a750 namespace=k8s.io Mar 17 17:30:31.788408 containerd[1538]: time="2025-03-17T17:30:31.788260246Z" level=warning msg="cleaning up after shim disconnected" id=4f481fd0ebd8fe118cdd7b1491711cfaec35eeb21645d593ba0c56c9e484a750 namespace=k8s.io Mar 17 17:30:31.788408 containerd[1538]: time="2025-03-17T17:30:31.788277006Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:30:31.831055 sshd[3512]: Accepted publickey for core from 10.0.0.1 port 49616 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:31.832274 sshd-session[3512]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:31.836375 systemd-logind[1523]: New session 8 of user core. Mar 17 17:30:31.843192 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 17 17:30:31.970058 sshd[3528]: Connection closed by 10.0.0.1 port 49616 Mar 17 17:30:31.970401 sshd-session[3512]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:31.974657 systemd[1]: sshd@7-10.0.0.79:22-10.0.0.1:49616.service: Deactivated successfully. Mar 17 17:30:31.976923 systemd-logind[1523]: Session 8 logged out. Waiting for processes to exit. Mar 17 17:30:31.976945 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 17:30:31.978270 systemd-logind[1523]: Removed session 8. Mar 17 17:30:32.064216 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4f481fd0ebd8fe118cdd7b1491711cfaec35eeb21645d593ba0c56c9e484a750-rootfs.mount: Deactivated successfully. Mar 17 17:30:32.185077 kubelet[2762]: E0317 17:30:32.185025 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:32.271887 kubelet[2762]: E0317 17:30:32.271778 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:32.272966 containerd[1538]: time="2025-03-17T17:30:32.272930766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:30:34.185129 kubelet[2762]: E0317 17:30:34.185083 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:35.019381 containerd[1538]: time="2025-03-17T17:30:35.019323231Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:35.020027 containerd[1538]: time="2025-03-17T17:30:35.019957272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 17 17:30:35.020568 containerd[1538]: time="2025-03-17T17:30:35.020526754Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:35.022706 containerd[1538]: 
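[Annotation] The "NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" entries for csi-node-driver-ddmk9 persist until a CNI network config shows up on disk; the ghcr.io/flatcar/calico/cni pull that begins above provides the install-cni container that writes one. A rough Go sketch of the underlying check, using the conventional CNI config directory (the path and file patterns are assumptions, not taken from containerd source):

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether any CNI network config exists in dir.
func cniConfigPresent(dir string) bool {
	for _, pattern := range []string{"*.conflist", "*.conf", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pattern))
		if err == nil && len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	// /etc/cni/net.d is the conventional CNI config directory; Calico's
	// install-cni container drops a conflist there once it has run.
	if cniConfigPresent("/etc/cni/net.d") {
		fmt.Println("NetworkReady=true")
		return
	}
	fmt.Println("NetworkReady=false reason:NetworkPluginNotReady message:cni plugin not initialized")
	os.Exit(1)
}
```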
time="2025-03-17T17:30:35.022672640Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:35.023589 containerd[1538]: time="2025-03-17T17:30:35.023555802Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 2.750586116s" Mar 17 17:30:35.023729 containerd[1538]: time="2025-03-17T17:30:35.023593202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 17 17:30:35.028639 containerd[1538]: time="2025-03-17T17:30:35.028602576Z" level=info msg="CreateContainer within sandbox \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:30:35.042653 containerd[1538]: time="2025-03-17T17:30:35.042542375Z" level=info msg="CreateContainer within sandbox \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"39754d9e6d3ac05f2683571e4ffba36796271299465c281d4858f666ddffd7fa\"" Mar 17 17:30:35.043098 containerd[1538]: time="2025-03-17T17:30:35.043068496Z" level=info msg="StartContainer for \"39754d9e6d3ac05f2683571e4ffba36796271299465c281d4858f666ddffd7fa\"" Mar 17 17:30:35.102497 containerd[1538]: time="2025-03-17T17:30:35.102443541Z" level=info msg="StartContainer for \"39754d9e6d3ac05f2683571e4ffba36796271299465c281d4858f666ddffd7fa\" returns successfully" Mar 17 17:30:35.279405 kubelet[2762]: E0317 17:30:35.278979 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:35.833917 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-39754d9e6d3ac05f2683571e4ffba36796271299465c281d4858f666ddffd7fa-rootfs.mount: Deactivated successfully. 
Mar 17 17:30:35.862563 kubelet[2762]: I0317 17:30:35.862519 2762 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 17:30:35.903437 containerd[1538]: time="2025-03-17T17:30:35.903359165Z" level=info msg="shim disconnected" id=39754d9e6d3ac05f2683571e4ffba36796271299465c281d4858f666ddffd7fa namespace=k8s.io Mar 17 17:30:35.903437 containerd[1538]: time="2025-03-17T17:30:35.903423725Z" level=warning msg="cleaning up after shim disconnected" id=39754d9e6d3ac05f2683571e4ffba36796271299465c281d4858f666ddffd7fa namespace=k8s.io Mar 17 17:30:35.903437 containerd[1538]: time="2025-03-17T17:30:35.903434445Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:30:35.947112 kubelet[2762]: I0317 17:30:35.947057 2762 topology_manager.go:215] "Topology Admit Handler" podUID="511a8295-1573-4289-aeb3-f68df5d31b5a" podNamespace="calico-apiserver" podName="calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:35.947532 kubelet[2762]: I0317 17:30:35.947478 2762 topology_manager.go:215] "Topology Admit Handler" podUID="2e8bd960-5c0f-417d-9ed7-865fc7dd73d5" podNamespace="calico-system" podName="calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:35.957175 kubelet[2762]: I0317 17:30:35.955937 2762 topology_manager.go:215] "Topology Admit Handler" podUID="abd10042-888c-4822-b83a-6040f4449647" podNamespace="kube-system" podName="coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:35.959755 kubelet[2762]: I0317 17:30:35.957536 2762 topology_manager.go:215] "Topology Admit Handler" podUID="ff9f13ce-0246-4ffb-9085-bff61711b9fb" podNamespace="kube-system" podName="coredns-7db6d8ff4d-zx22h" Mar 17 17:30:35.959755 kubelet[2762]: I0317 17:30:35.957718 2762 topology_manager.go:215] "Topology Admit Handler" podUID="dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d" podNamespace="calico-apiserver" podName="calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:36.057388 kubelet[2762]: I0317 17:30:36.057343 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e8bd960-5c0f-417d-9ed7-865fc7dd73d5-tigera-ca-bundle\") pod \"calico-kube-controllers-586bfd9c56-zr9qf\" (UID: \"2e8bd960-5c0f-417d-9ed7-865fc7dd73d5\") " pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:36.057591 kubelet[2762]: I0317 17:30:36.057573 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfn6r\" (UniqueName: \"kubernetes.io/projected/2e8bd960-5c0f-417d-9ed7-865fc7dd73d5-kube-api-access-pfn6r\") pod \"calico-kube-controllers-586bfd9c56-zr9qf\" (UID: \"2e8bd960-5c0f-417d-9ed7-865fc7dd73d5\") " pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:36.057682 kubelet[2762]: I0317 17:30:36.057668 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/511a8295-1573-4289-aeb3-f68df5d31b5a-calico-apiserver-certs\") pod \"calico-apiserver-57698566f9-jhj7p\" (UID: \"511a8295-1573-4289-aeb3-f68df5d31b5a\") " pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:36.057752 kubelet[2762]: I0317 17:30:36.057740 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblqb\" (UniqueName: \"kubernetes.io/projected/511a8295-1573-4289-aeb3-f68df5d31b5a-kube-api-access-nblqb\") pod \"calico-apiserver-57698566f9-jhj7p\" (UID: 
\"511a8295-1573-4289-aeb3-f68df5d31b5a\") " pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:36.158854 kubelet[2762]: I0317 17:30:36.158728 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff9f13ce-0246-4ffb-9085-bff61711b9fb-config-volume\") pod \"coredns-7db6d8ff4d-zx22h\" (UID: \"ff9f13ce-0246-4ffb-9085-bff61711b9fb\") " pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:36.158854 kubelet[2762]: I0317 17:30:36.158825 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/abd10042-888c-4822-b83a-6040f4449647-config-volume\") pod \"coredns-7db6d8ff4d-8tsrg\" (UID: \"abd10042-888c-4822-b83a-6040f4449647\") " pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:36.158854 kubelet[2762]: I0317 17:30:36.158848 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89bh8\" (UniqueName: \"kubernetes.io/projected/abd10042-888c-4822-b83a-6040f4449647-kube-api-access-89bh8\") pod \"coredns-7db6d8ff4d-8tsrg\" (UID: \"abd10042-888c-4822-b83a-6040f4449647\") " pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:36.159042 kubelet[2762]: I0317 17:30:36.158886 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj646\" (UniqueName: \"kubernetes.io/projected/ff9f13ce-0246-4ffb-9085-bff61711b9fb-kube-api-access-rj646\") pod \"coredns-7db6d8ff4d-zx22h\" (UID: \"ff9f13ce-0246-4ffb-9085-bff61711b9fb\") " pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:36.159042 kubelet[2762]: I0317 17:30:36.158909 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d-calico-apiserver-certs\") pod \"calico-apiserver-57698566f9-8n7jx\" (UID: \"dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d\") " pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:36.159042 kubelet[2762]: I0317 17:30:36.158927 2762 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw2rz\" (UniqueName: \"kubernetes.io/projected/dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d-kube-api-access-kw2rz\") pod \"calico-apiserver-57698566f9-8n7jx\" (UID: \"dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d\") " pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:36.187905 containerd[1538]: time="2025-03-17T17:30:36.187838658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:36.265520 containerd[1538]: time="2025-03-17T17:30:36.261941417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:0,}" Mar 17 17:30:36.273272 kubelet[2762]: E0317 17:30:36.273104 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:36.277905 containerd[1538]: time="2025-03-17T17:30:36.273951089Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:0,}" Mar 17 17:30:36.277905 containerd[1538]: time="2025-03-17T17:30:36.275118773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:30:36.284330 kubelet[2762]: E0317 17:30:36.284295 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:36.286213 containerd[1538]: time="2025-03-17T17:30:36.286171442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 17:30:36.454577 containerd[1538]: time="2025-03-17T17:30:36.454116653Z" level=error msg="Failed to destroy network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.456910 containerd[1538]: time="2025-03-17T17:30:36.456851821Z" level=error msg="Failed to destroy network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.457289 containerd[1538]: time="2025-03-17T17:30:36.457243182Z" level=error msg="encountered an error cleaning up failed sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.457358 containerd[1538]: time="2025-03-17T17:30:36.457324822Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.457842 containerd[1538]: time="2025-03-17T17:30:36.457743943Z" level=error msg="Failed to destroy network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.458033 containerd[1538]: time="2025-03-17T17:30:36.458008544Z" level=error msg="encountered an error cleaning up failed sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.458089 containerd[1538]: time="2025-03-17T17:30:36.458059304Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.459436 containerd[1538]: time="2025-03-17T17:30:36.459401188Z" level=error msg="encountered an error cleaning up failed sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.459576 containerd[1538]: time="2025-03-17T17:30:36.459555628Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.459644 kubelet[2762]: E0317 17:30:36.459574 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.459686 kubelet[2762]: E0317 17:30:36.459651 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:36.459708 kubelet[2762]: E0317 17:30:36.459690 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:36.460099 kubelet[2762]: E0317 17:30:36.459737 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:36.460099 kubelet[2762]: E0317 17:30:36.459913 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.460099 kubelet[2762]: E0317 17:30:36.459950 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:36.460243 kubelet[2762]: E0317 17:30:36.459968 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:36.460243 kubelet[2762]: E0317 17:30:36.460012 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" podUID="2e8bd960-5c0f-417d-9ed7-865fc7dd73d5" Mar 17 17:30:36.460352 kubelet[2762]: E0317 17:30:36.460307 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.460398 kubelet[2762]: E0317 17:30:36.460353 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:36.460398 kubelet[2762]: E0317 17:30:36.460368 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:36.460447 kubelet[2762]: E0317 17:30:36.460400 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" podUID="511a8295-1573-4289-aeb3-f68df5d31b5a" Mar 17 17:30:36.468058 containerd[1538]: time="2025-03-17T17:30:36.468011051Z" level=error msg="Failed to destroy network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.468362 containerd[1538]: time="2025-03-17T17:30:36.468329852Z" level=error msg="encountered an error cleaning up failed sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.468423 containerd[1538]: time="2025-03-17T17:30:36.468395132Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.468650 kubelet[2762]: E0317 17:30:36.468610 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.468719 kubelet[2762]: E0317 17:30:36.468669 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:36.468719 
kubelet[2762]: E0317 17:30:36.468688 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:36.468785 kubelet[2762]: E0317 17:30:36.468734 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zx22h" podUID="ff9f13ce-0246-4ffb-9085-bff61711b9fb" Mar 17 17:30:36.510144 kubelet[2762]: I0317 17:30:36.509832 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:36.510565 kubelet[2762]: E0317 17:30:36.510529 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:36.571693 kubelet[2762]: E0317 17:30:36.571656 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:36.572443 containerd[1538]: time="2025-03-17T17:30:36.572255611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:0,}" Mar 17 17:30:36.575277 containerd[1538]: time="2025-03-17T17:30:36.575007618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:30:36.643763 containerd[1538]: time="2025-03-17T17:30:36.643717603Z" level=error msg="Failed to destroy network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.644540 containerd[1538]: time="2025-03-17T17:30:36.644491885Z" level=error msg="Failed to destroy network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.644815 containerd[1538]: time="2025-03-17T17:30:36.644511445Z" level=error msg="encountered an error cleaning up failed sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Mar 17 17:30:36.644815 containerd[1538]: time="2025-03-17T17:30:36.644695725Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.644940 containerd[1538]: time="2025-03-17T17:30:36.644804526Z" level=error msg="encountered an error cleaning up failed sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.644940 containerd[1538]: time="2025-03-17T17:30:36.644848246Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.645131 kubelet[2762]: E0317 17:30:36.645071 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.645186 kubelet[2762]: E0317 17:30:36.645148 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:36.645211 kubelet[2762]: E0317 17:30:36.645085 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:36.645239 kubelet[2762]: E0317 17:30:36.645213 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:36.645263 kubelet[2762]: E0317 17:30:36.645233 2762 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:36.645301 kubelet[2762]: E0317 17:30:36.645273 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8tsrg" podUID="abd10042-888c-4822-b83a-6040f4449647" Mar 17 17:30:36.645510 kubelet[2762]: E0317 17:30:36.645171 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:36.646893 kubelet[2762]: E0317 17:30:36.646836 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" podUID="dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d" Mar 17 17:30:36.987112 systemd[1]: Started sshd@8-10.0.0.79:22-10.0.0.1:57432.service - OpenSSH per-connection server daemon (10.0.0.1:57432). Mar 17 17:30:37.031942 sshd[3855]: Accepted publickey for core from 10.0.0.1 port 57432 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:37.033355 sshd-session[3855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:37.037911 systemd-logind[1523]: New session 9 of user core. Mar 17 17:30:37.047191 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 17 17:30:37.163584 sshd[3858]: Connection closed by 10.0.0.1 port 57432 Mar 17 17:30:37.162701 sshd-session[3855]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:37.172081 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120-shm.mount: Deactivated successfully. 
Mar 17 17:30:37.174853 systemd[1]: sshd@8-10.0.0.79:22-10.0.0.1:57432.service: Deactivated successfully. Mar 17 17:30:37.177331 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 17:30:37.178018 systemd-logind[1523]: Session 9 logged out. Waiting for processes to exit. Mar 17 17:30:37.179394 systemd-logind[1523]: Removed session 9. Mar 17 17:30:37.286726 kubelet[2762]: I0317 17:30:37.286604 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e" Mar 17 17:30:37.288897 containerd[1538]: time="2025-03-17T17:30:37.288043669Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:30:37.288897 containerd[1538]: time="2025-03-17T17:30:37.288254470Z" level=info msg="Ensure that sandbox 1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e in task-service has been cleanup successfully" Mar 17 17:30:37.289945 kubelet[2762]: I0317 17:30:37.289918 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1" Mar 17 17:30:37.290657 containerd[1538]: time="2025-03-17T17:30:37.290402756Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:30:37.290657 containerd[1538]: time="2025-03-17T17:30:37.290552836Z" level=info msg="Ensure that sandbox cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1 in task-service has been cleanup successfully" Mar 17 17:30:37.291152 containerd[1538]: time="2025-03-17T17:30:37.290984837Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:30:37.291152 containerd[1538]: time="2025-03-17T17:30:37.291010957Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:30:37.291152 containerd[1538]: time="2025-03-17T17:30:37.291071157Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:30:37.291152 containerd[1538]: time="2025-03-17T17:30:37.291086917Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:30:37.291180 systemd[1]: run-netns-cni\x2dd15f99ac\x2df45f\x2d6d53\x2dd2ab\x2db32d683fd1b0.mount: Deactivated successfully. 
Mar 17 17:30:37.291730 containerd[1538]: time="2025-03-17T17:30:37.291559039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:1,}" Mar 17 17:30:37.291730 containerd[1538]: time="2025-03-17T17:30:37.291585279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:30:37.292938 kubelet[2762]: I0317 17:30:37.292911 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120" Mar 17 17:30:37.293479 containerd[1538]: time="2025-03-17T17:30:37.293372323Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:30:37.293534 containerd[1538]: time="2025-03-17T17:30:37.293518644Z" level=info msg="Ensure that sandbox 8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120 in task-service has been cleanup successfully" Mar 17 17:30:37.293781 containerd[1538]: time="2025-03-17T17:30:37.293661844Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:30:37.293781 containerd[1538]: time="2025-03-17T17:30:37.293680804Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:30:37.294287 kubelet[2762]: I0317 17:30:37.294268 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89" Mar 17 17:30:37.295056 containerd[1538]: time="2025-03-17T17:30:37.294779127Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:30:37.294903 systemd[1]: run-netns-cni\x2d3dd40e7d\x2d17cb\x2d7869\x2ded91\x2d00e986686852.mount: Deactivated successfully. 
Mar 17 17:30:37.295691 containerd[1538]: time="2025-03-17T17:30:37.295419889Z" level=info msg="Ensure that sandbox a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89 in task-service has been cleanup successfully" Mar 17 17:30:37.295846 kubelet[2762]: I0317 17:30:37.295775 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832" Mar 17 17:30:37.296037 containerd[1538]: time="2025-03-17T17:30:37.295960490Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:30:37.296037 containerd[1538]: time="2025-03-17T17:30:37.295981770Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:30:37.296446 containerd[1538]: time="2025-03-17T17:30:37.296337451Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:30:37.296598 kubelet[2762]: E0317 17:30:37.296482 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:37.297333 containerd[1538]: time="2025-03-17T17:30:37.296983373Z" level=info msg="Ensure that sandbox 6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832 in task-service has been cleanup successfully" Mar 17 17:30:37.297405 kubelet[2762]: I0317 17:30:37.297253 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a" Mar 17 17:30:37.297740 containerd[1538]: time="2025-03-17T17:30:37.297171893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:1,}" Mar 17 17:30:37.298038 containerd[1538]: time="2025-03-17T17:30:37.297985095Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:30:37.298038 containerd[1538]: time="2025-03-17T17:30:37.298004975Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns successfully" Mar 17 17:30:37.298541 containerd[1538]: time="2025-03-17T17:30:37.298283616Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:30:37.298541 containerd[1538]: time="2025-03-17T17:30:37.298409296Z" level=info msg="Ensure that sandbox 6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a in task-service has been cleanup successfully" Mar 17 17:30:37.298535 systemd[1]: run-netns-cni\x2dd82039ce\x2d7253\x2d219e\x2d4c20\x2d5afbfb297d07.mount: Deactivated successfully. Mar 17 17:30:37.298669 kubelet[2762]: E0317 17:30:37.298275 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:37.298647 systemd[1]: run-netns-cni\x2d1f85fabd\x2d9747\x2d7afe\x2d1e0c\x2d8dd49f1fef17.mount: Deactivated successfully. 
Mar 17 17:30:37.299207 containerd[1538]: time="2025-03-17T17:30:37.299058618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:30:37.299381 containerd[1538]: time="2025-03-17T17:30:37.299325739Z" level=info msg="TearDown network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:30:37.299381 containerd[1538]: time="2025-03-17T17:30:37.299344859Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 17 17:30:37.299814 kubelet[2762]: E0317 17:30:37.299714 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:37.300220 containerd[1538]: time="2025-03-17T17:30:37.300003941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:1,}" Mar 17 17:30:37.301477 systemd[1]: run-netns-cni\x2daaa9f791\x2db59f\x2de26a\x2d7cfb\x2d7d48521a8fe6.mount: Deactivated successfully. Mar 17 17:30:37.301590 systemd[1]: run-netns-cni\x2d66c48149\x2dc314\x2d08f9\x2ddec5\x2dad4b1986cd9b.mount: Deactivated successfully. Mar 17 17:30:37.306177 containerd[1538]: time="2025-03-17T17:30:37.306144197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:1,}" Mar 17 17:30:37.466736 containerd[1538]: time="2025-03-17T17:30:37.466690094Z" level=error msg="Failed to destroy network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.471234 containerd[1538]: time="2025-03-17T17:30:37.471174346Z" level=error msg="encountered an error cleaning up failed sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.471353 containerd[1538]: time="2025-03-17T17:30:37.471258586Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.471504 kubelet[2762]: E0317 17:30:37.471467 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.471602 kubelet[2762]: E0317 
17:30:37.471520 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:37.471602 kubelet[2762]: E0317 17:30:37.471542 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:37.471602 kubelet[2762]: E0317 17:30:37.471584 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" podUID="2e8bd960-5c0f-417d-9ed7-865fc7dd73d5" Mar 17 17:30:37.487955 containerd[1538]: time="2025-03-17T17:30:37.487855109Z" level=error msg="Failed to destroy network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.489471 containerd[1538]: time="2025-03-17T17:30:37.489365873Z" level=error msg="encountered an error cleaning up failed sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.489471 containerd[1538]: time="2025-03-17T17:30:37.489433953Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.489735 kubelet[2762]: E0317 17:30:37.489689 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.489795 kubelet[2762]: E0317 17:30:37.489761 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:37.489795 kubelet[2762]: E0317 17:30:37.489786 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:37.489853 kubelet[2762]: E0317 17:30:37.489830 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" podUID="dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d" Mar 17 17:30:37.502230 containerd[1538]: time="2025-03-17T17:30:37.502096946Z" level=error msg="Failed to destroy network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.503876 containerd[1538]: time="2025-03-17T17:30:37.503800631Z" level=error msg="encountered an error cleaning up failed sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.504260 containerd[1538]: time="2025-03-17T17:30:37.504112312Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.504640 kubelet[2762]: E0317 17:30:37.504575 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.504640 kubelet[2762]: E0317 17:30:37.504635 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:37.505026 kubelet[2762]: E0317 17:30:37.504656 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:37.505026 kubelet[2762]: E0317 17:30:37.504711 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" podUID="511a8295-1573-4289-aeb3-f68df5d31b5a" Mar 17 17:30:37.508085 containerd[1538]: time="2025-03-17T17:30:37.507838521Z" level=error msg="Failed to destroy network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.508462 containerd[1538]: time="2025-03-17T17:30:37.508423723Z" level=error msg="encountered an error cleaning up failed sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.508517 containerd[1538]: time="2025-03-17T17:30:37.508498043Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.508746 kubelet[2762]: E0317 
17:30:37.508715 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.509054 kubelet[2762]: E0317 17:30:37.508857 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:37.509054 kubelet[2762]: E0317 17:30:37.508893 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:37.509054 kubelet[2762]: E0317 17:30:37.508937 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zx22h" podUID="ff9f13ce-0246-4ffb-9085-bff61711b9fb" Mar 17 17:30:37.517049 containerd[1538]: time="2025-03-17T17:30:37.516995345Z" level=error msg="Failed to destroy network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.517447 containerd[1538]: time="2025-03-17T17:30:37.517359546Z" level=error msg="encountered an error cleaning up failed sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.517447 containerd[1538]: time="2025-03-17T17:30:37.517415626Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 17 17:30:37.517770 kubelet[2762]: E0317 17:30:37.517596 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.517770 kubelet[2762]: E0317 17:30:37.517648 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:37.517770 kubelet[2762]: E0317 17:30:37.517666 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:37.518064 kubelet[2762]: E0317 17:30:37.517700 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:37.534738 containerd[1538]: time="2025-03-17T17:30:37.534676951Z" level=error msg="Failed to destroy network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.535462 containerd[1538]: time="2025-03-17T17:30:37.535417713Z" level=error msg="encountered an error cleaning up failed sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.535514 containerd[1538]: time="2025-03-17T17:30:37.535488913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.535774 kubelet[2762]: E0317 17:30:37.535723 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:37.535820 kubelet[2762]: E0317 17:30:37.535787 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:37.535820 kubelet[2762]: E0317 17:30:37.535806 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:37.536102 kubelet[2762]: E0317 17:30:37.535848 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8tsrg" podUID="abd10042-888c-4822-b83a-6040f4449647" Mar 17 17:30:38.300346 kubelet[2762]: I0317 17:30:38.300316 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8" Mar 17 17:30:38.302460 containerd[1538]: time="2025-03-17T17:30:38.302094684Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:30:38.302460 containerd[1538]: time="2025-03-17T17:30:38.302267244Z" level=info msg="Ensure that sandbox 5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8 in task-service has been cleanup successfully" Mar 17 17:30:38.303797 kubelet[2762]: I0317 17:30:38.303417 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5" Mar 17 17:30:38.303886 containerd[1538]: time="2025-03-17T17:30:38.303424327Z" level=info msg="TearDown network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" successfully" Mar 17 17:30:38.303886 containerd[1538]: time="2025-03-17T17:30:38.303445087Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" returns 
successfully" Mar 17 17:30:38.304461 containerd[1538]: time="2025-03-17T17:30:38.304096929Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:30:38.304461 containerd[1538]: time="2025-03-17T17:30:38.304255129Z" level=info msg="Ensure that sandbox e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5 in task-service has been cleanup successfully" Mar 17 17:30:38.304772 containerd[1538]: time="2025-03-17T17:30:38.304650290Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:30:38.304772 containerd[1538]: time="2025-03-17T17:30:38.304723811Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:30:38.304772 containerd[1538]: time="2025-03-17T17:30:38.304733971Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:30:38.305123 systemd[1]: run-netns-cni\x2deea0476f\x2d96dd\x2da3e7\x2d308c\x2d5e9ebeda311c.mount: Deactivated successfully. Mar 17 17:30:38.305691 kubelet[2762]: E0317 17:30:38.305293 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:38.305725 containerd[1538]: time="2025-03-17T17:30:38.305137092Z" level=info msg="TearDown network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" successfully" Mar 17 17:30:38.305725 containerd[1538]: time="2025-03-17T17:30:38.305157212Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" returns successfully" Mar 17 17:30:38.306957 containerd[1538]: time="2025-03-17T17:30:38.306922736Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:30:38.307378 containerd[1538]: time="2025-03-17T17:30:38.307035456Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:30:38.307378 containerd[1538]: time="2025-03-17T17:30:38.307046536Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:30:38.307378 containerd[1538]: time="2025-03-17T17:30:38.307147897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:2,}" Mar 17 17:30:38.308443 systemd[1]: run-netns-cni\x2d31040b60\x2d47ec\x2de5b1\x2dd158\x2dc0074b5821e3.mount: Deactivated successfully. 
Mar 17 17:30:38.311235 kubelet[2762]: I0317 17:30:38.310172 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544" Mar 17 17:30:38.311303 containerd[1538]: time="2025-03-17T17:30:38.310122784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:2,}" Mar 17 17:30:38.311677 containerd[1538]: time="2025-03-17T17:30:38.311629228Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:30:38.312921 containerd[1538]: time="2025-03-17T17:30:38.312419630Z" level=info msg="Ensure that sandbox 2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544 in task-service has been cleanup successfully" Mar 17 17:30:38.313333 kubelet[2762]: I0317 17:30:38.313290 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688" Mar 17 17:30:38.313414 containerd[1538]: time="2025-03-17T17:30:38.313389072Z" level=info msg="TearDown network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" successfully" Mar 17 17:30:38.313453 containerd[1538]: time="2025-03-17T17:30:38.313412792Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" returns successfully" Mar 17 17:30:38.313846 containerd[1538]: time="2025-03-17T17:30:38.313811473Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:30:38.314030 containerd[1538]: time="2025-03-17T17:30:38.313987994Z" level=info msg="Ensure that sandbox 101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688 in task-service has been cleanup successfully" Mar 17 17:30:38.314293 containerd[1538]: time="2025-03-17T17:30:38.314271155Z" level=info msg="TearDown network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" successfully" Mar 17 17:30:38.314293 containerd[1538]: time="2025-03-17T17:30:38.314289955Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" returns successfully" Mar 17 17:30:38.315886 systemd[1]: run-netns-cni\x2d910b9c33\x2dcff5\x2da30d\x2dfefb\x2d1ebabd5751f1.mount: Deactivated successfully. Mar 17 17:30:38.316021 systemd[1]: run-netns-cni\x2d57b89bf8\x2dea0d\x2db61f\x2d443e\x2d27290bb78a25.mount: Deactivated successfully. 
Mar 17 17:30:38.318494 containerd[1538]: time="2025-03-17T17:30:38.318458045Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:30:38.318494 containerd[1538]: time="2025-03-17T17:30:38.318484965Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:30:38.318601 containerd[1538]: time="2025-03-17T17:30:38.318563485Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:30:38.318601 containerd[1538]: time="2025-03-17T17:30:38.318575406Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:30:38.320916 containerd[1538]: time="2025-03-17T17:30:38.319067287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:2,}" Mar 17 17:30:38.320992 containerd[1538]: time="2025-03-17T17:30:38.320351890Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:30:38.321234 containerd[1538]: time="2025-03-17T17:30:38.320996292Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns successfully" Mar 17 17:30:38.322391 containerd[1538]: time="2025-03-17T17:30:38.322360295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:30:38.322722 kubelet[2762]: I0317 17:30:38.322689 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de" Mar 17 17:30:38.323361 containerd[1538]: time="2025-03-17T17:30:38.323333418Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:30:38.323534 containerd[1538]: time="2025-03-17T17:30:38.323514778Z" level=info msg="Ensure that sandbox 347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de in task-service has been cleanup successfully" Mar 17 17:30:38.324511 containerd[1538]: time="2025-03-17T17:30:38.324482260Z" level=info msg="TearDown network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" successfully" Mar 17 17:30:38.324795 containerd[1538]: time="2025-03-17T17:30:38.324776541Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" returns successfully" Mar 17 17:30:38.325612 kubelet[2762]: I0317 17:30:38.325584 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128" Mar 17 17:30:38.326131 containerd[1538]: time="2025-03-17T17:30:38.326087184Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:30:38.326226 containerd[1538]: time="2025-03-17T17:30:38.326211145Z" level=info msg="TearDown network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:30:38.326268 containerd[1538]: time="2025-03-17T17:30:38.326225865Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 
17 17:30:38.326460 kubelet[2762]: E0317 17:30:38.326442 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:38.326551 containerd[1538]: time="2025-03-17T17:30:38.326517706Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:30:38.327817 containerd[1538]: time="2025-03-17T17:30:38.326965187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:2,}" Mar 17 17:30:38.328052 containerd[1538]: time="2025-03-17T17:30:38.328019309Z" level=info msg="Ensure that sandbox 02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128 in task-service has been cleanup successfully" Mar 17 17:30:38.329403 containerd[1538]: time="2025-03-17T17:30:38.329371233Z" level=info msg="TearDown network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" successfully" Mar 17 17:30:38.329403 containerd[1538]: time="2025-03-17T17:30:38.329396113Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" returns successfully" Mar 17 17:30:38.329903 containerd[1538]: time="2025-03-17T17:30:38.329861834Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:30:38.329983 containerd[1538]: time="2025-03-17T17:30:38.329964194Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:30:38.329983 containerd[1538]: time="2025-03-17T17:30:38.329980714Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:30:38.330476 containerd[1538]: time="2025-03-17T17:30:38.330446315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:30:38.842556 containerd[1538]: time="2025-03-17T17:30:38.842493087Z" level=error msg="Failed to destroy network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.843881 containerd[1538]: time="2025-03-17T17:30:38.843709970Z" level=error msg="Failed to destroy network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.848926 containerd[1538]: time="2025-03-17T17:30:38.847956021Z" level=error msg="encountered an error cleaning up failed sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.848926 containerd[1538]: time="2025-03-17T17:30:38.848044981Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.849071 kubelet[2762]: E0317 17:30:38.848288 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.849071 kubelet[2762]: E0317 17:30:38.848351 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:38.849071 kubelet[2762]: E0317 17:30:38.848374 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:38.849198 kubelet[2762]: E0317 17:30:38.848413 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zx22h" podUID="ff9f13ce-0246-4ffb-9085-bff61711b9fb" Mar 17 17:30:38.850809 containerd[1538]: time="2025-03-17T17:30:38.850770068Z" level=error msg="Failed to destroy network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.851369 containerd[1538]: time="2025-03-17T17:30:38.851331829Z" level=error msg="encountered an error cleaning up failed sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.851443 containerd[1538]: 
time="2025-03-17T17:30:38.851409430Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.851670 kubelet[2762]: E0317 17:30:38.851618 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.851725 kubelet[2762]: E0317 17:30:38.851678 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:38.851725 kubelet[2762]: E0317 17:30:38.851698 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:38.851773 kubelet[2762]: E0317 17:30:38.851735 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" podUID="2e8bd960-5c0f-417d-9ed7-865fc7dd73d5" Mar 17 17:30:38.857912 containerd[1538]: time="2025-03-17T17:30:38.857843846Z" level=error msg="encountered an error cleaning up failed sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.858203 containerd[1538]: time="2025-03-17T17:30:38.858174487Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:2,} failed, error" 
error="failed to setup network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.861669 kubelet[2762]: E0317 17:30:38.861620 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.861775 kubelet[2762]: E0317 17:30:38.861681 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:38.861775 kubelet[2762]: E0317 17:30:38.861701 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:38.861775 kubelet[2762]: E0317 17:30:38.861746 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" podUID="dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d" Mar 17 17:30:38.871070 containerd[1538]: time="2025-03-17T17:30:38.870983399Z" level=error msg="Failed to destroy network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871363 containerd[1538]: time="2025-03-17T17:30:38.871335560Z" level=error msg="encountered an error cleaning up failed sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871412 containerd[1538]: time="2025-03-17T17:30:38.871392760Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871476 containerd[1538]: time="2025-03-17T17:30:38.871405840Z" level=error msg="Failed to destroy network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871679 kubelet[2762]: E0317 17:30:38.871633 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871727 kubelet[2762]: E0317 17:30:38.871701 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:38.871758 containerd[1538]: time="2025-03-17T17:30:38.871684441Z" level=error msg="encountered an error cleaning up failed sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871758 containerd[1538]: time="2025-03-17T17:30:38.871718881Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.871884 kubelet[2762]: E0317 17:30:38.871728 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:38.871884 kubelet[2762]: E0317 17:30:38.871770 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" podUID="511a8295-1573-4289-aeb3-f68df5d31b5a" Mar 17 17:30:38.872709 kubelet[2762]: E0317 17:30:38.871927 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.872709 kubelet[2762]: E0317 17:30:38.871989 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:38.872709 kubelet[2762]: E0317 17:30:38.872010 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:38.872818 kubelet[2762]: E0317 17:30:38.872051 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:38.877103 containerd[1538]: time="2025-03-17T17:30:38.877052094Z" level=error msg="Failed to destroy network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.877540 containerd[1538]: time="2025-03-17T17:30:38.877502095Z" level=error msg="encountered an error cleaning up failed sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.877613 containerd[1538]: time="2025-03-17T17:30:38.877572896Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.877822 kubelet[2762]: E0317 17:30:38.877783 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:38.878065 kubelet[2762]: E0317 17:30:38.877936 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:38.878065 kubelet[2762]: E0317 17:30:38.877960 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:38.878065 kubelet[2762]: E0317 17:30:38.878004 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8tsrg" podUID="abd10042-888c-4822-b83a-6040f4449647" Mar 17 17:30:39.170124 systemd[1]: run-netns-cni\x2df5199525\x2d7306\x2dbc83\x2d9e44\x2dc4e82b042659.mount: Deactivated successfully. Mar 17 17:30:39.171337 systemd[1]: run-netns-cni\x2d41078782\x2db16c\x2d5bd7\x2d8e09\x2d6b5216dab79a.mount: Deactivated successfully. 
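Every sandbox attempt above fails for the same reason: the Calico CNI plugin cannot stat /var/lib/calico/nodename, a file the calico/node container writes once it is running and has /var/lib/calico mounted, so both the CNI add and delete calls error out and the kubelet keeps tearing the sandbox down and retrying. A minimal Go sketch of that precondition, assuming only what the error text itself states (illustrative, not the actual Calico plugin source):

```go
// Illustrative only: the repeated error
//   plugin type="calico" failed (add): stat /var/lib/calico/nodename: no such file or directory
// corresponds to a readiness check like this one. The nodename file is created by the
// calico/node container, so until that container runs and mounts /var/lib/calico,
// every CNI add/delete for a pod sandbox fails.
package main

import (
	"fmt"
	"os"
)

const nodenameFile = "/var/lib/calico/nodename" // path quoted in the log errors above

func calicoNodeReady() error {
	if _, err := os.Stat(nodenameFile); err != nil {
		return fmt.Errorf("stat %s: %w: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile, err)
	}
	return nil
}

func main() {
	if err := calicoNodeReady(); err != nil {
		fmt.Println("sandbox network setup would fail:", err)
		return
	}
	fmt.Println("nodename present; CNI calls can proceed")
}
```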
Mar 17 17:30:39.328986 kubelet[2762]: I0317 17:30:39.328953 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5" Mar 17 17:30:39.329507 containerd[1538]: time="2025-03-17T17:30:39.329452651Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" Mar 17 17:30:39.331171 containerd[1538]: time="2025-03-17T17:30:39.329621612Z" level=info msg="Ensure that sandbox 463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5 in task-service has been cleanup successfully" Mar 17 17:30:39.331171 containerd[1538]: time="2025-03-17T17:30:39.330228253Z" level=info msg="TearDown network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" successfully" Mar 17 17:30:39.331171 containerd[1538]: time="2025-03-17T17:30:39.330253773Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" returns successfully" Mar 17 17:30:39.331503 containerd[1538]: time="2025-03-17T17:30:39.331477736Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:30:39.331954 containerd[1538]: time="2025-03-17T17:30:39.331934977Z" level=info msg="TearDown network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" successfully" Mar 17 17:30:39.332034 containerd[1538]: time="2025-03-17T17:30:39.332019537Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" returns successfully" Mar 17 17:30:39.332450 containerd[1538]: time="2025-03-17T17:30:39.332424618Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:30:39.332618 containerd[1538]: time="2025-03-17T17:30:39.332595779Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:30:39.332654 containerd[1538]: time="2025-03-17T17:30:39.332621539Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:30:39.332667 systemd[1]: run-netns-cni\x2dfdc2ac62\x2d1e43\x2d9744\x2d0a28\x2d1a18f1d74f8b.mount: Deactivated successfully. 
Mar 17 17:30:39.333325 containerd[1538]: time="2025-03-17T17:30:39.333300781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:3,}" Mar 17 17:30:39.334245 kubelet[2762]: I0317 17:30:39.333794 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134" Mar 17 17:30:39.335654 containerd[1538]: time="2025-03-17T17:30:39.335630226Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" Mar 17 17:30:39.335951 containerd[1538]: time="2025-03-17T17:30:39.335926467Z" level=info msg="Ensure that sandbox 33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134 in task-service has been cleanup successfully" Mar 17 17:30:39.336313 containerd[1538]: time="2025-03-17T17:30:39.336290428Z" level=info msg="TearDown network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" successfully" Mar 17 17:30:39.336410 containerd[1538]: time="2025-03-17T17:30:39.336396468Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" returns successfully" Mar 17 17:30:39.337260 containerd[1538]: time="2025-03-17T17:30:39.337230270Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:30:39.337325 containerd[1538]: time="2025-03-17T17:30:39.337310550Z" level=info msg="TearDown network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" successfully" Mar 17 17:30:39.337325 containerd[1538]: time="2025-03-17T17:30:39.337320990Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" returns successfully" Mar 17 17:30:39.339529 containerd[1538]: time="2025-03-17T17:30:39.338214113Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:30:39.339529 containerd[1538]: time="2025-03-17T17:30:39.338309753Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:30:39.339529 containerd[1538]: time="2025-03-17T17:30:39.338320473Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:30:39.339529 containerd[1538]: time="2025-03-17T17:30:39.339149195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:3,}" Mar 17 17:30:39.339803 kubelet[2762]: I0317 17:30:39.338507 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2" Mar 17 17:30:39.339250 systemd[1]: run-netns-cni\x2d059898eb\x2d7fac\x2dca5e\x2d5f9d\x2d68f89bb16cfc.mount: Deactivated successfully. 
Mar 17 17:30:39.341190 containerd[1538]: time="2025-03-17T17:30:39.341079920Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" Mar 17 17:30:39.341909 containerd[1538]: time="2025-03-17T17:30:39.341305240Z" level=info msg="Ensure that sandbox d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2 in task-service has been cleanup successfully" Mar 17 17:30:39.343185 containerd[1538]: time="2025-03-17T17:30:39.343027564Z" level=info msg="TearDown network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" successfully" Mar 17 17:30:39.343185 containerd[1538]: time="2025-03-17T17:30:39.343054884Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" returns successfully" Mar 17 17:30:39.344076 systemd[1]: run-netns-cni\x2d59c2027f\x2db4e9\x2da841\x2dc703\x2df44ed4955b0c.mount: Deactivated successfully. Mar 17 17:30:39.344669 kubelet[2762]: I0317 17:30:39.344512 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.344797369Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.344917369Z" level=info msg="TearDown network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" successfully" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.344929169Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" returns successfully" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.345272210Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.345378770Z" level=info msg="TearDown network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.345391890Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.345478250Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" Mar 17 17:30:39.345824 containerd[1538]: time="2025-03-17T17:30:39.345805531Z" level=info msg="Ensure that sandbox cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31 in task-service has been cleanup successfully" Mar 17 17:30:39.346063 kubelet[2762]: E0317 17:30:39.345600 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:39.346096 containerd[1538]: time="2025-03-17T17:30:39.345831011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:3,}" Mar 17 17:30:39.346096 containerd[1538]: time="2025-03-17T17:30:39.346043212Z" level=info msg="TearDown network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" successfully" Mar 17 17:30:39.346096 containerd[1538]: time="2025-03-17T17:30:39.346059452Z" level=info 
msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" returns successfully" Mar 17 17:30:39.346493 containerd[1538]: time="2025-03-17T17:30:39.346465573Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:30:39.346562 containerd[1538]: time="2025-03-17T17:30:39.346543453Z" level=info msg="TearDown network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" successfully" Mar 17 17:30:39.346562 containerd[1538]: time="2025-03-17T17:30:39.346560093Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" returns successfully" Mar 17 17:30:39.346771 kubelet[2762]: I0317 17:30:39.346745 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481" Mar 17 17:30:39.347372 containerd[1538]: time="2025-03-17T17:30:39.347341095Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:30:39.347432 containerd[1538]: time="2025-03-17T17:30:39.347420615Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:30:39.347432 containerd[1538]: time="2025-03-17T17:30:39.347430055Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:30:39.347478 containerd[1538]: time="2025-03-17T17:30:39.347453655Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" Mar 17 17:30:39.348975 containerd[1538]: time="2025-03-17T17:30:39.347582135Z" level=info msg="Ensure that sandbox 36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481 in task-service has been cleanup successfully" Mar 17 17:30:39.348975 containerd[1538]: time="2025-03-17T17:30:39.348306297Z" level=info msg="TearDown network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" successfully" Mar 17 17:30:39.348975 containerd[1538]: time="2025-03-17T17:30:39.348320537Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" returns successfully" Mar 17 17:30:39.348975 containerd[1538]: time="2025-03-17T17:30:39.348670378Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:30:39.348975 containerd[1538]: time="2025-03-17T17:30:39.348748618Z" level=info msg="TearDown network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" successfully" Mar 17 17:30:39.348975 containerd[1538]: time="2025-03-17T17:30:39.348757218Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" returns successfully" Mar 17 17:30:39.348507 systemd[1]: run-netns-cni\x2d2519a3e6\x2dae71\x2ddc79\x2d7cc9\x2da1f915db0580.mount: Deactivated successfully. 
Mar 17 17:30:39.349207 containerd[1538]: time="2025-03-17T17:30:39.349139899Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:30:39.349230 containerd[1538]: time="2025-03-17T17:30:39.349208299Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:30:39.349230 containerd[1538]: time="2025-03-17T17:30:39.349217139Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:30:39.349504 kubelet[2762]: E0317 17:30:39.349465 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:39.349800 containerd[1538]: time="2025-03-17T17:30:39.349775421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:3,}" Mar 17 17:30:39.350072 kubelet[2762]: I0317 17:30:39.350048 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c" Mar 17 17:30:39.350728 containerd[1538]: time="2025-03-17T17:30:39.350704183Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" Mar 17 17:30:39.351071 containerd[1538]: time="2025-03-17T17:30:39.351014904Z" level=info msg="Ensure that sandbox 253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c in task-service has been cleanup successfully" Mar 17 17:30:39.351199 containerd[1538]: time="2025-03-17T17:30:39.351171464Z" level=info msg="TearDown network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" successfully" Mar 17 17:30:39.351199 containerd[1538]: time="2025-03-17T17:30:39.351188544Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" returns successfully" Mar 17 17:30:39.351542 containerd[1538]: time="2025-03-17T17:30:39.351518865Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:30:39.351624 containerd[1538]: time="2025-03-17T17:30:39.351594705Z" level=info msg="TearDown network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" successfully" Mar 17 17:30:39.351624 containerd[1538]: time="2025-03-17T17:30:39.351607505Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" returns successfully" Mar 17 17:30:39.351996 containerd[1538]: time="2025-03-17T17:30:39.351977026Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:30:39.352399 containerd[1538]: time="2025-03-17T17:30:39.352373787Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:30:39.352448 containerd[1538]: time="2025-03-17T17:30:39.352397747Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns successfully" Mar 17 17:30:39.352979 containerd[1538]: time="2025-03-17T17:30:39.352956629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:3,}" 
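The &PodSandboxMetadata{...,Attempt:2,} and Attempt:3 strings are containerd printing the sandbox metadata carried in each CRI RunPodSandbox request; the Attempt counter is what increments each time the kubelet retries a pod after a failed network setup. A self-contained sketch of that shape, with field names mirroring the CRI PodSandboxMetadata message (the struct below is a stand-in for illustration, not an import of the real API):

```go
// Minimal sketch: reproduce the metadata shape containerd logs for RunPodSandbox.
// Fields mirror PodSandboxMetadata from the CRI API; redeclared here so the
// example is self-contained and clearly not the real client code.
package main

import "fmt"

type PodSandboxMetadata struct {
	Name      string
	Uid       string
	Namespace string
	Attempt   uint32
}

func main() {
	md := PodSandboxMetadata{
		Name:      "coredns-7db6d8ff4d-8tsrg",
		Uid:       "abd10042-888c-4822-b83a-6040f4449647",
		Namespace: "kube-system",
		Attempt:   3, // bumped from 2 after the previous sandbox attempt failed
	}
	fmt.Printf("RunPodSandbox for &PodSandboxMetadata{Name:%s,Uid:%s,Namespace:%s,Attempt:%d,}\n",
		md.Name, md.Uid, md.Namespace, md.Attempt)
}
```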
Mar 17 17:30:39.423396 containerd[1538]: time="2025-03-17T17:30:39.423275161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:30:39.610629 containerd[1538]: time="2025-03-17T17:30:39.610576899Z" level=error msg="Failed to destroy network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.611832 containerd[1538]: time="2025-03-17T17:30:39.611673862Z" level=error msg="encountered an error cleaning up failed sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.611832 containerd[1538]: time="2025-03-17T17:30:39.611737702Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.612034 kubelet[2762]: E0317 17:30:39.611980 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.612117 kubelet[2762]: E0317 17:30:39.612060 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:39.612117 kubelet[2762]: E0317 17:30:39.612083 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:39.612177 kubelet[2762]: E0317 17:30:39.612143 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:39.614931 containerd[1538]: time="2025-03-17T17:30:39.614841190Z" level=error msg="Failed to destroy network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.615311 containerd[1538]: time="2025-03-17T17:30:39.615277951Z" level=error msg="encountered an error cleaning up failed sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.615373 containerd[1538]: time="2025-03-17T17:30:39.615342511Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.615782 kubelet[2762]: E0317 17:30:39.615605 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.615782 kubelet[2762]: E0317 17:30:39.615678 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:39.615782 kubelet[2762]: E0317 17:30:39.615696 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:39.615921 kubelet[2762]: E0317 17:30:39.615740 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" podUID="2e8bd960-5c0f-417d-9ed7-865fc7dd73d5" Mar 17 17:30:39.616443 containerd[1538]: time="2025-03-17T17:30:39.616400634Z" level=error msg="Failed to destroy network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.616753 containerd[1538]: time="2025-03-17T17:30:39.616720474Z" level=error msg="encountered an error cleaning up failed sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.616816 containerd[1538]: time="2025-03-17T17:30:39.616774755Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.616954 kubelet[2762]: E0317 17:30:39.616926 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.617043 kubelet[2762]: E0317 17:30:39.616966 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:39.617043 kubelet[2762]: E0317 17:30:39.616983 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:39.617043 kubelet[2762]: E0317 17:30:39.617011 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zx22h" podUID="ff9f13ce-0246-4ffb-9085-bff61711b9fb" Mar 17 17:30:39.634114 containerd[1538]: time="2025-03-17T17:30:39.633997077Z" level=error msg="Failed to destroy network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.634442 containerd[1538]: time="2025-03-17T17:30:39.634412558Z" level=error msg="encountered an error cleaning up failed sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.634515 containerd[1538]: time="2025-03-17T17:30:39.634479398Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.634736 kubelet[2762]: E0317 17:30:39.634684 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.635024 kubelet[2762]: E0317 17:30:39.634824 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:39.635024 kubelet[2762]: E0317 17:30:39.634852 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:39.635024 kubelet[2762]: E0317 
17:30:39.634915 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" podUID="dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d" Mar 17 17:30:39.641084 containerd[1538]: time="2025-03-17T17:30:39.641046654Z" level=error msg="Failed to destroy network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.641483 containerd[1538]: time="2025-03-17T17:30:39.641457615Z" level=error msg="encountered an error cleaning up failed sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.641597 containerd[1538]: time="2025-03-17T17:30:39.641575655Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.641888 kubelet[2762]: E0317 17:30:39.641836 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.641965 kubelet[2762]: E0317 17:30:39.641903 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:39.641965 kubelet[2762]: E0317 17:30:39.641924 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:39.642015 kubelet[2762]: E0317 17:30:39.641962 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" podUID="511a8295-1573-4289-aeb3-f68df5d31b5a" Mar 17 17:30:39.644735 containerd[1538]: time="2025-03-17T17:30:39.644706103Z" level=error msg="Failed to destroy network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.645034 containerd[1538]: time="2025-03-17T17:30:39.645010024Z" level=error msg="encountered an error cleaning up failed sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.645085 containerd[1538]: time="2025-03-17T17:30:39.645056704Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.645263 kubelet[2762]: E0317 17:30:39.645237 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:39.645298 kubelet[2762]: E0317 17:30:39.645276 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:39.645331 kubelet[2762]: E0317 17:30:39.645296 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:39.645356 kubelet[2762]: E0317 17:30:39.645329 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8tsrg" podUID="abd10042-888c-4822-b83a-6040f4449647" Mar 17 17:30:40.169143 systemd[1]: run-netns-cni\x2d2f80bb51\x2d5782\x2dfbf2\x2d55c1\x2d82449b3584a2.mount: Deactivated successfully. Mar 17 17:30:40.169277 systemd[1]: run-netns-cni\x2de12891e1\x2dae7d\x2d58f7\x2d2e3d\x2d99c67f75b8ff.mount: Deactivated successfully. Mar 17 17:30:40.353991 kubelet[2762]: I0317 17:30:40.353961 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9" Mar 17 17:30:40.358126 kubelet[2762]: I0317 17:30:40.358094 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c" Mar 17 17:30:40.358448 containerd[1538]: time="2025-03-17T17:30:40.358408785Z" level=info msg="StopPodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\"" Mar 17 17:30:40.360025 containerd[1538]: time="2025-03-17T17:30:40.358579426Z" level=info msg="Ensure that sandbox d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9 in task-service has been cleanup successfully" Mar 17 17:30:40.360025 containerd[1538]: time="2025-03-17T17:30:40.358748626Z" level=info msg="TearDown network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" successfully" Mar 17 17:30:40.360025 containerd[1538]: time="2025-03-17T17:30:40.358764866Z" level=info msg="StopPodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" returns successfully" Mar 17 17:30:40.360025 containerd[1538]: time="2025-03-17T17:30:40.358893866Z" level=info msg="StopPodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\"" Mar 17 17:30:40.360025 containerd[1538]: time="2025-03-17T17:30:40.359069387Z" level=info msg="Ensure that sandbox 5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c in task-service has been cleanup successfully" Mar 17 17:30:40.360472 containerd[1538]: time="2025-03-17T17:30:40.360250870Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" Mar 17 17:30:40.360472 containerd[1538]: time="2025-03-17T17:30:40.360339590Z" level=info msg="TearDown network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" successfully" Mar 17 17:30:40.360472 containerd[1538]: time="2025-03-17T17:30:40.360350950Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" returns successfully" Mar 17 17:30:40.361393 
containerd[1538]: time="2025-03-17T17:30:40.361290152Z" level=info msg="TearDown network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" successfully" Mar 17 17:30:40.361696 containerd[1538]: time="2025-03-17T17:30:40.361320912Z" level=info msg="StopPodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" returns successfully" Mar 17 17:30:40.361760 containerd[1538]: time="2025-03-17T17:30:40.361590433Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:30:40.361830 containerd[1538]: time="2025-03-17T17:30:40.361814313Z" level=info msg="TearDown network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" successfully" Mar 17 17:30:40.361992 containerd[1538]: time="2025-03-17T17:30:40.361942314Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" returns successfully" Mar 17 17:30:40.362319 containerd[1538]: time="2025-03-17T17:30:40.362283515Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" Mar 17 17:30:40.362411 containerd[1538]: time="2025-03-17T17:30:40.362363355Z" level=info msg="TearDown network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" successfully" Mar 17 17:30:40.362411 containerd[1538]: time="2025-03-17T17:30:40.362373795Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" returns successfully" Mar 17 17:30:40.362468 containerd[1538]: time="2025-03-17T17:30:40.362295115Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:30:40.362491 containerd[1538]: time="2025-03-17T17:30:40.362465595Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:30:40.362801 containerd[1538]: time="2025-03-17T17:30:40.362475155Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:30:40.363741 systemd[1]: run-netns-cni\x2debaaf483\x2d11bc\x2d3952\x2d58f6\x2d8ca8b0467b56.mount: Deactivated successfully. Mar 17 17:30:40.363942 systemd[1]: run-netns-cni\x2d1c16138a\x2de164\x2de5e2\x2d9bea\x2df65c5c2a6063.mount: Deactivated successfully. 
Mar 17 17:30:40.364077 containerd[1538]: time="2025-03-17T17:30:40.363826318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:4,}" Mar 17 17:30:40.364077 containerd[1538]: time="2025-03-17T17:30:40.363947079Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:30:40.364077 containerd[1538]: time="2025-03-17T17:30:40.364056159Z" level=info msg="TearDown network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" successfully" Mar 17 17:30:40.364077 containerd[1538]: time="2025-03-17T17:30:40.364066919Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" returns successfully" Mar 17 17:30:40.364621 containerd[1538]: time="2025-03-17T17:30:40.364591320Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:30:40.364690 containerd[1538]: time="2025-03-17T17:30:40.364672760Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:30:40.364719 containerd[1538]: time="2025-03-17T17:30:40.364688240Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:30:40.365267 containerd[1538]: time="2025-03-17T17:30:40.365231202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:4,}" Mar 17 17:30:40.366066 kubelet[2762]: I0317 17:30:40.365497 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb" Mar 17 17:30:40.366154 containerd[1538]: time="2025-03-17T17:30:40.365920323Z" level=info msg="StopPodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\"" Mar 17 17:30:40.368040 containerd[1538]: time="2025-03-17T17:30:40.367009966Z" level=info msg="Ensure that sandbox a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb in task-service has been cleanup successfully" Mar 17 17:30:40.368347 containerd[1538]: time="2025-03-17T17:30:40.368321089Z" level=info msg="TearDown network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" successfully" Mar 17 17:30:40.368347 containerd[1538]: time="2025-03-17T17:30:40.368345129Z" level=info msg="StopPodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" returns successfully" Mar 17 17:30:40.368662 containerd[1538]: time="2025-03-17T17:30:40.368620850Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" Mar 17 17:30:40.368736 containerd[1538]: time="2025-03-17T17:30:40.368711490Z" level=info msg="TearDown network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" successfully" Mar 17 17:30:40.368736 containerd[1538]: time="2025-03-17T17:30:40.368728090Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" returns successfully" Mar 17 17:30:40.369602 containerd[1538]: time="2025-03-17T17:30:40.369471572Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:30:40.369602 containerd[1538]: 
time="2025-03-17T17:30:40.369561892Z" level=info msg="TearDown network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" successfully" Mar 17 17:30:40.369602 containerd[1538]: time="2025-03-17T17:30:40.369571812Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" returns successfully" Mar 17 17:30:40.371724 containerd[1538]: time="2025-03-17T17:30:40.370629654Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:30:40.371724 containerd[1538]: time="2025-03-17T17:30:40.370714455Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:30:40.371724 containerd[1538]: time="2025-03-17T17:30:40.370723695Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns successfully" Mar 17 17:30:40.371364 systemd[1]: run-netns-cni\x2d70191d52\x2ddd42\x2de2bb\x2d4a49\x2d58da842b6682.mount: Deactivated successfully. Mar 17 17:30:40.371929 kubelet[2762]: I0317 17:30:40.371173 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1" Mar 17 17:30:40.372247 containerd[1538]: time="2025-03-17T17:30:40.372206098Z" level=info msg="StopPodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\"" Mar 17 17:30:40.372501 containerd[1538]: time="2025-03-17T17:30:40.372434459Z" level=info msg="Ensure that sandbox 94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1 in task-service has been cleanup successfully" Mar 17 17:30:40.372547 containerd[1538]: time="2025-03-17T17:30:40.372516019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:30:40.372661 containerd[1538]: time="2025-03-17T17:30:40.372626939Z" level=info msg="TearDown network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" successfully" Mar 17 17:30:40.372695 containerd[1538]: time="2025-03-17T17:30:40.372661619Z" level=info msg="StopPodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" returns successfully" Mar 17 17:30:40.374433 systemd[1]: run-netns-cni\x2d7bfbca2b\x2d34aa\x2dae84\x2ddce4\x2d9f88b6891b63.mount: Deactivated successfully. 
Mar 17 17:30:40.375060 containerd[1538]: time="2025-03-17T17:30:40.374853384Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" Mar 17 17:30:40.375060 containerd[1538]: time="2025-03-17T17:30:40.374960025Z" level=info msg="TearDown network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" successfully" Mar 17 17:30:40.375060 containerd[1538]: time="2025-03-17T17:30:40.374972185Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" returns successfully" Mar 17 17:30:40.375876 containerd[1538]: time="2025-03-17T17:30:40.375788707Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:30:40.375876 containerd[1538]: time="2025-03-17T17:30:40.375876187Z" level=info msg="TearDown network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" successfully" Mar 17 17:30:40.375973 containerd[1538]: time="2025-03-17T17:30:40.375888627Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" returns successfully" Mar 17 17:30:40.376559 containerd[1538]: time="2025-03-17T17:30:40.376408788Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:30:40.376559 containerd[1538]: time="2025-03-17T17:30:40.376497268Z" level=info msg="TearDown network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:30:40.376559 containerd[1538]: time="2025-03-17T17:30:40.376506988Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 17 17:30:40.377276 kubelet[2762]: E0317 17:30:40.377119 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:40.377488 containerd[1538]: time="2025-03-17T17:30:40.377428791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:4,}" Mar 17 17:30:40.377995 kubelet[2762]: I0317 17:30:40.377648 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7" Mar 17 17:30:40.378146 containerd[1538]: time="2025-03-17T17:30:40.378117472Z" level=info msg="StopPodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\"" Mar 17 17:30:40.378293 containerd[1538]: time="2025-03-17T17:30:40.378268953Z" level=info msg="Ensure that sandbox af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7 in task-service has been cleanup successfully" Mar 17 17:30:40.378533 containerd[1538]: time="2025-03-17T17:30:40.378462273Z" level=info msg="TearDown network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" successfully" Mar 17 17:30:40.378533 containerd[1538]: time="2025-03-17T17:30:40.378481313Z" level=info msg="StopPodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" returns successfully" Mar 17 17:30:40.379203 containerd[1538]: time="2025-03-17T17:30:40.379009154Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" Mar 17 17:30:40.379203 containerd[1538]: 
time="2025-03-17T17:30:40.379088835Z" level=info msg="TearDown network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" successfully" Mar 17 17:30:40.379203 containerd[1538]: time="2025-03-17T17:30:40.379099675Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" returns successfully" Mar 17 17:30:40.379424 containerd[1538]: time="2025-03-17T17:30:40.379401355Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:30:40.379493 containerd[1538]: time="2025-03-17T17:30:40.379476715Z" level=info msg="TearDown network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" successfully" Mar 17 17:30:40.379493 containerd[1538]: time="2025-03-17T17:30:40.379489715Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" returns successfully" Mar 17 17:30:40.379890 containerd[1538]: time="2025-03-17T17:30:40.379688996Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:30:40.379890 containerd[1538]: time="2025-03-17T17:30:40.379755476Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:30:40.379890 containerd[1538]: time="2025-03-17T17:30:40.379765556Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:30:40.380590 containerd[1538]: time="2025-03-17T17:30:40.380443798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:30:40.380785 kubelet[2762]: I0317 17:30:40.380699 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921" Mar 17 17:30:40.381179 containerd[1538]: time="2025-03-17T17:30:40.381151159Z" level=info msg="StopPodSandbox for \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\"" Mar 17 17:30:40.381310 containerd[1538]: time="2025-03-17T17:30:40.381291560Z" level=info msg="Ensure that sandbox 5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921 in task-service has been cleanup successfully" Mar 17 17:30:40.381598 containerd[1538]: time="2025-03-17T17:30:40.381570920Z" level=info msg="TearDown network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" successfully" Mar 17 17:30:40.381598 containerd[1538]: time="2025-03-17T17:30:40.381590240Z" level=info msg="StopPodSandbox for \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" returns successfully" Mar 17 17:30:40.382071 containerd[1538]: time="2025-03-17T17:30:40.381970921Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" Mar 17 17:30:40.382071 containerd[1538]: time="2025-03-17T17:30:40.382053882Z" level=info msg="TearDown network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" successfully" Mar 17 17:30:40.382071 containerd[1538]: time="2025-03-17T17:30:40.382064722Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" returns successfully" Mar 17 17:30:40.382561 containerd[1538]: time="2025-03-17T17:30:40.382419642Z" level=info 
msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:30:40.382561 containerd[1538]: time="2025-03-17T17:30:40.382483563Z" level=info msg="TearDown network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" successfully" Mar 17 17:30:40.382561 containerd[1538]: time="2025-03-17T17:30:40.382492403Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" returns successfully" Mar 17 17:30:40.382849 containerd[1538]: time="2025-03-17T17:30:40.382760523Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:30:40.383152 containerd[1538]: time="2025-03-17T17:30:40.383126004Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:30:40.383152 containerd[1538]: time="2025-03-17T17:30:40.383144884Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:30:40.383365 kubelet[2762]: E0317 17:30:40.383344 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:40.383799 containerd[1538]: time="2025-03-17T17:30:40.383702486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:4,}" Mar 17 17:30:40.708601 containerd[1538]: time="2025-03-17T17:30:40.708523298Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 17 17:30:40.714891 containerd[1538]: time="2025-03-17T17:30:40.714837153Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 4.428620511s" Mar 17 17:30:40.714891 containerd[1538]: time="2025-03-17T17:30:40.714889313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 17 17:30:40.725636 containerd[1538]: time="2025-03-17T17:30:40.725594979Z" level=info msg="CreateContainer within sandbox \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 17:30:40.735706 containerd[1538]: time="2025-03-17T17:30:40.735661483Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:40.741933 containerd[1538]: time="2025-03-17T17:30:40.741889337Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:40.742531 containerd[1538]: time="2025-03-17T17:30:40.742506339Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:40.752590 containerd[1538]: time="2025-03-17T17:30:40.752521803Z" 
level=info msg="CreateContainer within sandbox \"973eb51fe8ba3ea79aa48220786a579c75ccd065878c01a9dd9ecd0fb1d93625\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3fbba7feb2fa3be2717e5323c0f5ada47920147c4745391ddf0701acbb717c28\"" Mar 17 17:30:40.753741 containerd[1538]: time="2025-03-17T17:30:40.753712486Z" level=info msg="StartContainer for \"3fbba7feb2fa3be2717e5323c0f5ada47920147c4745391ddf0701acbb717c28\"" Mar 17 17:30:40.805140 containerd[1538]: time="2025-03-17T17:30:40.805066728Z" level=error msg="Failed to destroy network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.806946 containerd[1538]: time="2025-03-17T17:30:40.806803372Z" level=error msg="encountered an error cleaning up failed sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.806946 containerd[1538]: time="2025-03-17T17:30:40.806892252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.807332 containerd[1538]: time="2025-03-17T17:30:40.807137613Z" level=error msg="Failed to destroy network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.807361 kubelet[2762]: E0317 17:30:40.807133 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.807361 kubelet[2762]: E0317 17:30:40.807197 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:40.807361 kubelet[2762]: E0317 17:30:40.807219 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ddmk9" Mar 17 17:30:40.807517 kubelet[2762]: E0317 17:30:40.807327 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ddmk9_calico-system(191fd0a9-26f3-46f0-864d-1b5b729dbb52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ddmk9" podUID="191fd0a9-26f3-46f0-864d-1b5b729dbb52" Mar 17 17:30:40.807984 containerd[1538]: time="2025-03-17T17:30:40.807800574Z" level=error msg="encountered an error cleaning up failed sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.807984 containerd[1538]: time="2025-03-17T17:30:40.807853094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.809599 kubelet[2762]: E0317 17:30:40.809457 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.809599 kubelet[2762]: E0317 17:30:40.809502 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:40.809599 kubelet[2762]: E0317 17:30:40.809519 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8tsrg" Mar 17 17:30:40.809745 kubelet[2762]: E0317 17:30:40.809552 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8tsrg_kube-system(abd10042-888c-4822-b83a-6040f4449647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8tsrg" podUID="abd10042-888c-4822-b83a-6040f4449647" Mar 17 17:30:40.812279 containerd[1538]: time="2025-03-17T17:30:40.812136185Z" level=error msg="Failed to destroy network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.813137 containerd[1538]: time="2025-03-17T17:30:40.813084907Z" level=error msg="encountered an error cleaning up failed sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.813195 containerd[1538]: time="2025-03-17T17:30:40.813160827Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.813401 kubelet[2762]: E0317 17:30:40.813372 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.813459 kubelet[2762]: E0317 17:30:40.813419 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:40.813459 kubelet[2762]: E0317 17:30:40.813437 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" Mar 17 17:30:40.814092 kubelet[2762]: E0317 17:30:40.814050 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-8n7jx_calico-apiserver(dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" podUID="dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d" Mar 17 17:30:40.819809 containerd[1538]: time="2025-03-17T17:30:40.819737883Z" level=error msg="Failed to destroy network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.820175 containerd[1538]: time="2025-03-17T17:30:40.820142924Z" level=error msg="encountered an error cleaning up failed sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.820226 containerd[1538]: time="2025-03-17T17:30:40.820205444Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.820768 kubelet[2762]: E0317 17:30:40.820440 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.820768 kubelet[2762]: E0317 17:30:40.820501 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:40.820768 kubelet[2762]: E0317 17:30:40.820519 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" Mar 17 17:30:40.820888 kubelet[2762]: E0317 17:30:40.820557 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-586bfd9c56-zr9qf_calico-system(2e8bd960-5c0f-417d-9ed7-865fc7dd73d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" podUID="2e8bd960-5c0f-417d-9ed7-865fc7dd73d5" Mar 17 17:30:40.825052 containerd[1538]: time="2025-03-17T17:30:40.824999175Z" level=error msg="Failed to destroy network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.828694 containerd[1538]: time="2025-03-17T17:30:40.826254058Z" level=error msg="encountered an error cleaning up failed sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.828694 containerd[1538]: time="2025-03-17T17:30:40.826314338Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.828875 kubelet[2762]: E0317 17:30:40.826599 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.828875 kubelet[2762]: E0317 17:30:40.826651 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:40.828875 kubelet[2762]: E0317 17:30:40.826679 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" Mar 17 17:30:40.828988 kubelet[2762]: E0317 17:30:40.826768 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-57698566f9-jhj7p_calico-apiserver(511a8295-1573-4289-aeb3-f68df5d31b5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" podUID="511a8295-1573-4289-aeb3-f68df5d31b5a" Mar 17 17:30:40.833718 containerd[1538]: time="2025-03-17T17:30:40.833677436Z" level=error msg="Failed to destroy network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.834057 containerd[1538]: time="2025-03-17T17:30:40.834028357Z" level=error msg="encountered an error cleaning up failed sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.834153 containerd[1538]: time="2025-03-17T17:30:40.834095357Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.834396 kubelet[2762]: E0317 17:30:40.834361 2762 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:30:40.834632 kubelet[2762]: E0317 17:30:40.834515 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:40.834632 kubelet[2762]: E0317 17:30:40.834559 2762 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-zx22h" Mar 17 17:30:40.834632 kubelet[2762]: E0317 17:30:40.834605 2762 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-zx22h_kube-system(ff9f13ce-0246-4ffb-9085-bff61711b9fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-zx22h" podUID="ff9f13ce-0246-4ffb-9085-bff61711b9fb" Mar 17 17:30:40.866355 containerd[1538]: time="2025-03-17T17:30:40.866235473Z" level=info msg="StartContainer for \"3fbba7feb2fa3be2717e5323c0f5ada47920147c4745391ddf0701acbb717c28\" returns successfully" Mar 17 17:30:41.044538 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 17:30:41.044642 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 17:30:41.172983 systemd[1]: run-netns-cni\x2d30f0b7bd\x2d7c68\x2d21f7\x2dec67\x2d20fb3ee46d98.mount: Deactivated successfully. Mar 17 17:30:41.173140 systemd[1]: run-netns-cni\x2d69e30686\x2d8357\x2d2fa1\x2d86bd\x2d298e20c837b5.mount: Deactivated successfully. Mar 17 17:30:41.173223 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount102048507.mount: Deactivated successfully. 
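Editor's note: at this point StartContainer for calico-node ("3fbba7feb2...") has returned successfully and the WireGuard module is loaded, so the sandbox errors above should stop once calico-node writes /var/lib/calico/nodename. The sketch below simply waits for that file to appear; the 60-second deadline and 2-second poll interval are arbitrary illustrative choices, not values taken from the log.

// wait_nodename.go - sketch: poll until calico-node has written its nodename file.
package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	const nodenameFile = "/var/lib/calico/nodename"
	deadline := time.Now().Add(60 * time.Second) // assumed timeout for illustration

	for {
		if _, err := os.Stat(nodenameFile); err == nil {
			fmt.Printf("%s exists; sandbox creation should succeed on the next kubelet retry\n", nodenameFile)
			return
		} else if !os.IsNotExist(err) {
			fmt.Fprintf(os.Stderr, "stat %s: %v\n", nodenameFile, err)
			os.Exit(1)
		}
		if time.Now().After(deadline) {
			fmt.Fprintf(os.Stderr, "timed out waiting for %s; calico-node may still be initialising\n", nodenameFile)
			os.Exit(1)
		}
		time.Sleep(2 * time.Second)
	}
}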
Mar 17 17:30:41.386896 kubelet[2762]: I0317 17:30:41.386687 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7" Mar 17 17:30:41.387633 containerd[1538]: time="2025-03-17T17:30:41.387593888Z" level=info msg="StopPodSandbox for \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\"" Mar 17 17:30:41.387916 containerd[1538]: time="2025-03-17T17:30:41.387776409Z" level=info msg="Ensure that sandbox 17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7 in task-service has been cleanup successfully" Mar 17 17:30:41.389490 containerd[1538]: time="2025-03-17T17:30:41.388047089Z" level=info msg="TearDown network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\" successfully" Mar 17 17:30:41.389490 containerd[1538]: time="2025-03-17T17:30:41.388063649Z" level=info msg="StopPodSandbox for \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\" returns successfully" Mar 17 17:30:41.389490 containerd[1538]: time="2025-03-17T17:30:41.389343332Z" level=info msg="StopPodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\"" Mar 17 17:30:41.389709 containerd[1538]: time="2025-03-17T17:30:41.389647453Z" level=info msg="TearDown network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" successfully" Mar 17 17:30:41.389709 containerd[1538]: time="2025-03-17T17:30:41.389664653Z" level=info msg="StopPodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" returns successfully" Mar 17 17:30:41.390249 systemd[1]: run-netns-cni\x2de08b7018\x2df767\x2daaa3\x2da624\x2dae37b0a2f449.mount: Deactivated successfully. Mar 17 17:30:41.391074 containerd[1538]: time="2025-03-17T17:30:41.390278294Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" Mar 17 17:30:41.391074 containerd[1538]: time="2025-03-17T17:30:41.390360855Z" level=info msg="TearDown network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" successfully" Mar 17 17:30:41.391074 containerd[1538]: time="2025-03-17T17:30:41.390371575Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" returns successfully" Mar 17 17:30:41.391533 containerd[1538]: time="2025-03-17T17:30:41.391238297Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:30:41.391533 containerd[1538]: time="2025-03-17T17:30:41.391338777Z" level=info msg="TearDown network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" successfully" Mar 17 17:30:41.391533 containerd[1538]: time="2025-03-17T17:30:41.391349777Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" returns successfully" Mar 17 17:30:41.392111 containerd[1538]: time="2025-03-17T17:30:41.391903378Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:30:41.392111 containerd[1538]: time="2025-03-17T17:30:41.392056778Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:30:41.392111 containerd[1538]: time="2025-03-17T17:30:41.392070858Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns 
successfully" Mar 17 17:30:41.392720 kubelet[2762]: I0317 17:30:41.392691 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4" Mar 17 17:30:41.393254 containerd[1538]: time="2025-03-17T17:30:41.392999261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:30:41.393320 containerd[1538]: time="2025-03-17T17:30:41.393296821Z" level=info msg="StopPodSandbox for \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\"" Mar 17 17:30:41.394749 containerd[1538]: time="2025-03-17T17:30:41.393472542Z" level=info msg="Ensure that sandbox ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4 in task-service has been cleanup successfully" Mar 17 17:30:41.394749 containerd[1538]: time="2025-03-17T17:30:41.393680782Z" level=info msg="TearDown network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\" successfully" Mar 17 17:30:41.394749 containerd[1538]: time="2025-03-17T17:30:41.393695982Z" level=info msg="StopPodSandbox for \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\" returns successfully" Mar 17 17:30:41.395697 systemd[1]: run-netns-cni\x2d2e58834a\x2dbfa8\x2d4668\x2d0231\x2defa28fc41046.mount: Deactivated successfully. Mar 17 17:30:41.396697 containerd[1538]: time="2025-03-17T17:30:41.396242228Z" level=info msg="StopPodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\"" Mar 17 17:30:41.396697 containerd[1538]: time="2025-03-17T17:30:41.396420629Z" level=info msg="TearDown network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" successfully" Mar 17 17:30:41.396697 containerd[1538]: time="2025-03-17T17:30:41.396435309Z" level=info msg="StopPodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" returns successfully" Mar 17 17:30:41.397105 containerd[1538]: time="2025-03-17T17:30:41.397063630Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" Mar 17 17:30:41.397767 containerd[1538]: time="2025-03-17T17:30:41.397734752Z" level=info msg="TearDown network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" successfully" Mar 17 17:30:41.397987 containerd[1538]: time="2025-03-17T17:30:41.397944192Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" returns successfully" Mar 17 17:30:41.398928 containerd[1538]: time="2025-03-17T17:30:41.398902914Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:30:41.399047 containerd[1538]: time="2025-03-17T17:30:41.399030755Z" level=info msg="TearDown network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" successfully" Mar 17 17:30:41.399072 containerd[1538]: time="2025-03-17T17:30:41.399047395Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" returns successfully" Mar 17 17:30:41.399416 containerd[1538]: time="2025-03-17T17:30:41.399392715Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:30:41.400225 containerd[1538]: time="2025-03-17T17:30:41.399501196Z" level=info msg="TearDown network for sandbox 
\"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:30:41.400225 containerd[1538]: time="2025-03-17T17:30:41.399528116Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 17 17:30:41.400293 kubelet[2762]: E0317 17:30:41.400253 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:41.400565 containerd[1538]: time="2025-03-17T17:30:41.400526518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:5,}" Mar 17 17:30:41.400671 kubelet[2762]: I0317 17:30:41.400651 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942" Mar 17 17:30:41.404600 containerd[1538]: time="2025-03-17T17:30:41.402346842Z" level=info msg="StopPodSandbox for \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\"" Mar 17 17:30:41.404600 containerd[1538]: time="2025-03-17T17:30:41.402626963Z" level=info msg="Ensure that sandbox 70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942 in task-service has been cleanup successfully" Mar 17 17:30:41.404945 containerd[1538]: time="2025-03-17T17:30:41.404906488Z" level=info msg="TearDown network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\" successfully" Mar 17 17:30:41.405002 containerd[1538]: time="2025-03-17T17:30:41.404941728Z" level=info msg="StopPodSandbox for \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\" returns successfully" Mar 17 17:30:41.405091 systemd[1]: run-netns-cni\x2de14bf5bf\x2d81da\x2ddb7a\x2d650c\x2d01289ffd526a.mount: Deactivated successfully. 
Mar 17 17:30:41.405687 containerd[1538]: time="2025-03-17T17:30:41.405629130Z" level=info msg="StopPodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\"" Mar 17 17:30:41.406686 containerd[1538]: time="2025-03-17T17:30:41.406583452Z" level=info msg="TearDown network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" successfully" Mar 17 17:30:41.406686 containerd[1538]: time="2025-03-17T17:30:41.406611932Z" level=info msg="StopPodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" returns successfully" Mar 17 17:30:41.408390 kubelet[2762]: I0317 17:30:41.408264 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94" Mar 17 17:30:41.408474 containerd[1538]: time="2025-03-17T17:30:41.407284934Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" Mar 17 17:30:41.410006 containerd[1538]: time="2025-03-17T17:30:41.409975300Z" level=info msg="TearDown network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" successfully" Mar 17 17:30:41.410251 containerd[1538]: time="2025-03-17T17:30:41.410077180Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" returns successfully" Mar 17 17:30:41.410251 containerd[1538]: time="2025-03-17T17:30:41.410152220Z" level=info msg="StopPodSandbox for \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\"" Mar 17 17:30:41.410343 containerd[1538]: time="2025-03-17T17:30:41.410313261Z" level=info msg="Ensure that sandbox 4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94 in task-service has been cleanup successfully" Mar 17 17:30:41.410808 containerd[1538]: time="2025-03-17T17:30:41.410587101Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:30:41.410808 containerd[1538]: time="2025-03-17T17:30:41.410670821Z" level=info msg="TearDown network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" successfully" Mar 17 17:30:41.410808 containerd[1538]: time="2025-03-17T17:30:41.410681702Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" returns successfully" Mar 17 17:30:41.411368 containerd[1538]: time="2025-03-17T17:30:41.411338943Z" level=info msg="TearDown network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\" successfully" Mar 17 17:30:41.411368 containerd[1538]: time="2025-03-17T17:30:41.411365623Z" level=info msg="StopPodSandbox for \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\" returns successfully" Mar 17 17:30:41.412310 containerd[1538]: time="2025-03-17T17:30:41.411993585Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:30:41.412310 containerd[1538]: time="2025-03-17T17:30:41.412124185Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:30:41.412310 containerd[1538]: time="2025-03-17T17:30:41.412135705Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:30:41.412310 containerd[1538]: time="2025-03-17T17:30:41.412137425Z" level=info msg="StopPodSandbox for 
\"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\"" Mar 17 17:30:41.412310 containerd[1538]: time="2025-03-17T17:30:41.412246705Z" level=info msg="TearDown network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" successfully" Mar 17 17:30:41.412310 containerd[1538]: time="2025-03-17T17:30:41.412257385Z" level=info msg="StopPodSandbox for \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" returns successfully" Mar 17 17:30:41.413208 systemd[1]: run-netns-cni\x2dc069336c\x2d2060\x2d24ba\x2d56bb\x2df90379dc6e6a.mount: Deactivated successfully. Mar 17 17:30:41.415248 containerd[1538]: time="2025-03-17T17:30:41.414243830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:5,}" Mar 17 17:30:41.415248 containerd[1538]: time="2025-03-17T17:30:41.414370350Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" Mar 17 17:30:41.415248 containerd[1538]: time="2025-03-17T17:30:41.414698031Z" level=info msg="TearDown network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" successfully" Mar 17 17:30:41.415248 containerd[1538]: time="2025-03-17T17:30:41.414714471Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" returns successfully" Mar 17 17:30:41.415745 containerd[1538]: time="2025-03-17T17:30:41.415717633Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:30:41.415836 containerd[1538]: time="2025-03-17T17:30:41.415820433Z" level=info msg="TearDown network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" successfully" Mar 17 17:30:41.415915 containerd[1538]: time="2025-03-17T17:30:41.415835713Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" returns successfully" Mar 17 17:30:41.416047 kubelet[2762]: I0317 17:30:41.415986 2762 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a" Mar 17 17:30:41.416172 containerd[1538]: time="2025-03-17T17:30:41.416153834Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:30:41.416254 containerd[1538]: time="2025-03-17T17:30:41.416228554Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:30:41.416254 containerd[1538]: time="2025-03-17T17:30:41.416239394Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:30:41.416500 containerd[1538]: time="2025-03-17T17:30:41.416483355Z" level=info msg="StopPodSandbox for \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\"" Mar 17 17:30:41.416549 kubelet[2762]: E0317 17:30:41.416494 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:41.416832 containerd[1538]: time="2025-03-17T17:30:41.416780356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:5,}" Mar 17 17:30:41.417115 
containerd[1538]: time="2025-03-17T17:30:41.416796916Z" level=info msg="Ensure that sandbox b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a in task-service has been cleanup successfully" Mar 17 17:30:41.417481 containerd[1538]: time="2025-03-17T17:30:41.417376517Z" level=info msg="TearDown network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\" successfully" Mar 17 17:30:41.417481 containerd[1538]: time="2025-03-17T17:30:41.417393917Z" level=info msg="StopPodSandbox for \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\" returns successfully" Mar 17 17:30:41.419350 containerd[1538]: time="2025-03-17T17:30:41.418703200Z" level=info msg="StopPodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\"" Mar 17 17:30:41.419350 containerd[1538]: time="2025-03-17T17:30:41.419231601Z" level=info msg="TearDown network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" successfully" Mar 17 17:30:41.419350 containerd[1538]: time="2025-03-17T17:30:41.419249761Z" level=info msg="StopPodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" returns successfully" Mar 17 17:30:41.420041 containerd[1538]: time="2025-03-17T17:30:41.420018203Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" Mar 17 17:30:41.420390 containerd[1538]: time="2025-03-17T17:30:41.420362844Z" level=info msg="TearDown network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" successfully" Mar 17 17:30:41.420807 containerd[1538]: time="2025-03-17T17:30:41.420748925Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" returns successfully" Mar 17 17:30:41.422107 kubelet[2762]: E0317 17:30:41.422078 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:41.422780 containerd[1538]: time="2025-03-17T17:30:41.422186128Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:30:41.422780 containerd[1538]: time="2025-03-17T17:30:41.422281488Z" level=info msg="TearDown network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" successfully" Mar 17 17:30:41.422780 containerd[1538]: time="2025-03-17T17:30:41.422291448Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" returns successfully" Mar 17 17:30:41.423900 containerd[1538]: time="2025-03-17T17:30:41.423367411Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:30:41.423900 containerd[1538]: time="2025-03-17T17:30:41.423456971Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:30:41.423900 containerd[1538]: time="2025-03-17T17:30:41.423467731Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:30:41.424527 containerd[1538]: time="2025-03-17T17:30:41.424312613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:5,}" Mar 17 17:30:41.426662 kubelet[2762]: I0317 17:30:41.425030 2762 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.425525616Z" level=info msg="StopPodSandbox for \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\"" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.425699496Z" level=info msg="Ensure that sandbox d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1 in task-service has been cleanup successfully" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.426024537Z" level=info msg="TearDown network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\" successfully" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.426041737Z" level=info msg="StopPodSandbox for \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\" returns successfully" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.426352738Z" level=info msg="StopPodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\"" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.426427058Z" level=info msg="TearDown network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" successfully" Mar 17 17:30:41.426773 containerd[1538]: time="2025-03-17T17:30:41.426436578Z" level=info msg="StopPodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" returns successfully" Mar 17 17:30:41.429214 containerd[1538]: time="2025-03-17T17:30:41.429185664Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" Mar 17 17:30:41.429287 containerd[1538]: time="2025-03-17T17:30:41.429280505Z" level=info msg="TearDown network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" successfully" Mar 17 17:30:41.429314 containerd[1538]: time="2025-03-17T17:30:41.429289945Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" returns successfully" Mar 17 17:30:41.429675 containerd[1538]: time="2025-03-17T17:30:41.429640665Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:30:41.429774 containerd[1538]: time="2025-03-17T17:30:41.429756666Z" level=info msg="TearDown network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" successfully" Mar 17 17:30:41.429801 containerd[1538]: time="2025-03-17T17:30:41.429772226Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" returns successfully" Mar 17 17:30:41.430563 containerd[1538]: time="2025-03-17T17:30:41.430537667Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:30:41.430639 containerd[1538]: time="2025-03-17T17:30:41.430623068Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:30:41.430668 containerd[1538]: time="2025-03-17T17:30:41.430638548Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:30:41.431080 containerd[1538]: time="2025-03-17T17:30:41.431056309Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:5,}" Mar 17 17:30:41.942956 systemd-networkd[1232]: calicf80693fa74: Link UP Mar 17 17:30:41.944979 systemd-networkd[1232]: calicf80693fa74: Gained carrier Mar 17 17:30:41.957385 kubelet[2762]: I0317 17:30:41.957307 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ncp6c" podStartSLOduration=1.802410286 podStartE2EDuration="13.957289726s" podCreationTimestamp="2025-03-17 17:30:28 +0000 UTC" firstStartedPulling="2025-03-17 17:30:28.560764235 +0000 UTC m=+25.456524124" lastFinishedPulling="2025-03-17 17:30:40.715643675 +0000 UTC m=+37.611403564" observedRunningTime="2025-03-17 17:30:41.441410293 +0000 UTC m=+38.337170182" watchObservedRunningTime="2025-03-17 17:30:41.957289726 +0000 UTC m=+38.853049615" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.550 [INFO][4900] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.574 [INFO][4900] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0 coredns-7db6d8ff4d- kube-system ff9f13ce-0246-4ffb-9085-bff61711b9fb 762 0 2025-03-17 17:30:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-zx22h eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicf80693fa74 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.574 [INFO][4900] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.850 [INFO][4944] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" HandleID="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Workload="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.887 [INFO][4944] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" HandleID="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Workload="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000325670), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-zx22h", "timestamp":"2025-03-17 17:30:41.844009744 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.887 [INFO][4944] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.887 [INFO][4944] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.890 [INFO][4944] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4944] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.912 [INFO][4944] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.916 [INFO][4944] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.918 [INFO][4944] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.920 [INFO][4944] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.920 [INFO][4944] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.921 [INFO][4944] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.925 [INFO][4944] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.930 [INFO][4944] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.930 [INFO][4944] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" host="localhost" Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.930 [INFO][4944] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
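The IPAM entries above walk through Calico's address assignment for coredns-7db6d8ff4d-zx22h: the host "localhost" holds an affinity for the block 192.168.88.128/26, and the first free address claimed from it is 192.168.88.129/26 (the later requests in this log receive .130, .131 and .132 from the same block). A quick standalone check of that arithmetic, not Calico's IPAM code:

    // cidrcheck.go - sanity check of the block shown in the IPAM entries above:
    // 192.168.88.128/26 spans .128 through .191, so the sequentially assigned
    // pod addresses (.129 onward later in this log) all belong to the
    // host-affine block.
    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        _, block, err := net.ParseCIDR("192.168.88.128/26")
        if err != nil {
            panic(err)
        }
        ones, bits := block.Mask.Size()
        fmt.Printf("block %s holds %d addresses\n", block, 1<<(bits-ones)) // 64

        for _, ip := range []string{"192.168.88.129", "192.168.88.130", "192.168.88.131", "192.168.88.132"} {
            fmt.Printf("%s in block: %v\n", ip, block.Contains(net.ParseIP(ip)))
        }
    }

A /26 leaves six host bits, so the block spans 192.168.88.128 through 192.168.88.191 and comfortably covers the four pods being networked here.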
Mar 17 17:30:41.961364 containerd[1538]: 2025-03-17 17:30:41.930 [INFO][4944] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" HandleID="k8s-pod-network.2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Workload="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.962068 containerd[1538]: 2025-03-17 17:30:41.933 [INFO][4900] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"ff9f13ce-0246-4ffb-9085-bff61711b9fb", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-zx22h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicf80693fa74", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:41.962068 containerd[1538]: 2025-03-17 17:30:41.933 [INFO][4900] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.962068 containerd[1538]: 2025-03-17 17:30:41.933 [INFO][4900] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicf80693fa74 ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.962068 containerd[1538]: 2025-03-17 17:30:41.946 [INFO][4900] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.962068 containerd[1538]: 2025-03-17 17:30:41.948 
[INFO][4900] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"ff9f13ce-0246-4ffb-9085-bff61711b9fb", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d", Pod:"coredns-7db6d8ff4d-zx22h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicf80693fa74", MAC:"92:dd:6c:b7:4e:c8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:41.962068 containerd[1538]: 2025-03-17 17:30:41.957 [INFO][4900] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-zx22h" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--zx22h-eth0" Mar 17 17:30:41.966978 systemd-networkd[1232]: cali0c0ec5c2ce1: Link UP Mar 17 17:30:41.967171 systemd-networkd[1232]: cali0c0ec5c2ce1: Gained carrier Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.517 [INFO][4859] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.555 [INFO][4859] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--ddmk9-eth0 csi-node-driver- calico-system 191fd0a9-26f3-46f0-864d-1b5b729dbb52 627 0 2025-03-17 17:30:28 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-ddmk9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0c0ec5c2ce1 [] []}} 
ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.555 [INFO][4859] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.851 [INFO][4919] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" HandleID="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Workload="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.886 [INFO][4919] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" HandleID="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Workload="localhost-k8s-csi--node--driver--ddmk9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004185b0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-ddmk9", "timestamp":"2025-03-17 17:30:41.851571041 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.887 [INFO][4919] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.930 [INFO][4919] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.930 [INFO][4919] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.933 [INFO][4919] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.937 [INFO][4919] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.942 [INFO][4919] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.943 [INFO][4919] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.945 [INFO][4919] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.945 [INFO][4919] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.947 [INFO][4919] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.952 [INFO][4919] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.960 [INFO][4919] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.961 [INFO][4919] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" host="localhost" Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.961 [INFO][4919] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:30:41.980594 containerd[1538]: 2025-03-17 17:30:41.961 [INFO][4919] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" HandleID="k8s-pod-network.bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Workload="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.981291 containerd[1538]: 2025-03-17 17:30:41.963 [INFO][4859] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ddmk9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"191fd0a9-26f3-46f0-864d-1b5b729dbb52", ResourceVersion:"627", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-ddmk9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0c0ec5c2ce1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:41.981291 containerd[1538]: 2025-03-17 17:30:41.963 [INFO][4859] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.981291 containerd[1538]: 2025-03-17 17:30:41.963 [INFO][4859] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0c0ec5c2ce1 ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.981291 containerd[1538]: 2025-03-17 17:30:41.967 [INFO][4859] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.981291 containerd[1538]: 2025-03-17 17:30:41.967 [INFO][4859] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ddmk9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"191fd0a9-26f3-46f0-864d-1b5b729dbb52", ResourceVersion:"627", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b", Pod:"csi-node-driver-ddmk9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0c0ec5c2ce1", MAC:"8a:cf:c1:ed:c0:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:41.981291 containerd[1538]: 2025-03-17 17:30:41.978 [INFO][4859] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b" Namespace="calico-system" Pod="csi-node-driver-ddmk9" WorkloadEndpoint="localhost-k8s-csi--node--driver--ddmk9-eth0" Mar 17 17:30:41.990832 containerd[1538]: time="2025-03-17T17:30:41.990433803Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:41.990832 containerd[1538]: time="2025-03-17T17:30:41.990530923Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:41.990832 containerd[1538]: time="2025-03-17T17:30:41.990548803Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:41.990832 containerd[1538]: time="2025-03-17T17:30:41.990661923Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.012230 systemd-networkd[1232]: calid1ecc893db5: Link UP Mar 17 17:30:42.013041 systemd-networkd[1232]: calid1ecc893db5: Gained carrier Mar 17 17:30:42.020638 containerd[1538]: time="2025-03-17T17:30:42.003043432Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:42.020638 containerd[1538]: time="2025-03-17T17:30:42.020029590Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:42.020638 containerd[1538]: time="2025-03-17T17:30:42.020058790Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.020638 containerd[1538]: time="2025-03-17T17:30:42.020215230Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.034438 systemd-resolved[1437]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.526 [INFO][4868] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.560 [INFO][4868] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0 calico-kube-controllers-586bfd9c56- calico-system 2e8bd960-5c0f-417d-9ed7-865fc7dd73d5 761 0 2025-03-17 17:30:28 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:586bfd9c56 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-586bfd9c56-zr9qf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid1ecc893db5 [] []}} ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.560 [INFO][4868] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.843 [INFO][4932] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" HandleID="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Workload="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4932] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" HandleID="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Workload="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c1f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-586bfd9c56-zr9qf", "timestamp":"2025-03-17 17:30:41.843786343 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4932] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.961 [INFO][4932] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.961 [INFO][4932] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.963 [INFO][4932] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.972 [INFO][4932] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.981 [INFO][4932] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.983 [INFO][4932] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.987 [INFO][4932] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.988 [INFO][4932] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.993 [INFO][4932] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68 Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:41.999 [INFO][4932] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:42.004 [INFO][4932] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:42.004 [INFO][4932] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" host="localhost" Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:42.004 [INFO][4932] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:30:42.037835 containerd[1538]: 2025-03-17 17:30:42.005 [INFO][4932] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" HandleID="k8s-pod-network.ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Workload="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.038415 containerd[1538]: 2025-03-17 17:30:42.008 [INFO][4868] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0", GenerateName:"calico-kube-controllers-586bfd9c56-", Namespace:"calico-system", SelfLink:"", UID:"2e8bd960-5c0f-417d-9ed7-865fc7dd73d5", ResourceVersion:"761", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"586bfd9c56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-586bfd9c56-zr9qf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid1ecc893db5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.038415 containerd[1538]: 2025-03-17 17:30:42.008 [INFO][4868] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.038415 containerd[1538]: 2025-03-17 17:30:42.009 [INFO][4868] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid1ecc893db5 ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.038415 containerd[1538]: 2025-03-17 17:30:42.013 [INFO][4868] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.038415 containerd[1538]: 2025-03-17 17:30:42.015 [INFO][4868] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0", GenerateName:"calico-kube-controllers-586bfd9c56-", Namespace:"calico-system", SelfLink:"", UID:"2e8bd960-5c0f-417d-9ed7-865fc7dd73d5", ResourceVersion:"761", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"586bfd9c56", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68", Pod:"calico-kube-controllers-586bfd9c56-zr9qf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid1ecc893db5", MAC:"c2:b0:07:6a:42:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.038415 containerd[1538]: 2025-03-17 17:30:42.034 [INFO][4868] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68" Namespace="calico-system" Pod="calico-kube-controllers-586bfd9c56-zr9qf" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--586bfd9c56--zr9qf-eth0" Mar 17 17:30:42.056443 systemd-resolved[1437]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:30:42.060345 containerd[1538]: time="2025-03-17T17:30:42.060306161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-zx22h,Uid:ff9f13ce-0246-4ffb-9085-bff61711b9fb,Namespace:kube-system,Attempt:5,} returns sandbox id \"2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d\"" Mar 17 17:30:42.061034 kubelet[2762]: E0317 17:30:42.061008 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:42.066240 containerd[1538]: time="2025-03-17T17:30:42.066193214Z" level=info msg="CreateContainer within sandbox \"2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:30:42.073350 systemd-networkd[1232]: cali33efb402993: Link UP Mar 17 17:30:42.073545 systemd-networkd[1232]: cali33efb402993: Gained carrier Mar 17 17:30:42.088765 containerd[1538]: time="2025-03-17T17:30:42.088728105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ddmk9,Uid:191fd0a9-26f3-46f0-864d-1b5b729dbb52,Namespace:calico-system,Attempt:5,} returns sandbox id 
\"bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b\"" Mar 17 17:30:42.092033 containerd[1538]: time="2025-03-17T17:30:42.090644629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:41.558 [INFO][4886] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:41.589 [INFO][4886] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0 calico-apiserver-57698566f9- calico-apiserver 511a8295-1573-4289-aeb3-f68df5d31b5a 756 0 2025-03-17 17:30:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57698566f9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-57698566f9-jhj7p eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali33efb402993 [] []}} ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:41.589 [INFO][4886] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:41.846 [INFO][4950] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" HandleID="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Workload="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4950] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" HandleID="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Workload="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40005483e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-57698566f9-jhj7p", "timestamp":"2025-03-17 17:30:41.846085749 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4950] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.005 [INFO][4950] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.006 [INFO][4950] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.009 [INFO][4950] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.017 [INFO][4950] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.027 [INFO][4950] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.035 [INFO][4950] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.039 [INFO][4950] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.039 [INFO][4950] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.041 [INFO][4950] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4 Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.052 [INFO][4950] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.066 [INFO][4950] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.067 [INFO][4950] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" host="localhost" Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.067 [INFO][4950] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
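Taken together, the four interleaved IPAM sequences in this log ([INFO][4944], [4919], [4932] and [4950]) show the CNI ADD requests queuing on the same host-wide IPAM lock: each logs "About to acquire", waits until the previous request has logged "Released", and only then claims the next address from 192.168.88.128/26. A minimal sketch of that serialization pattern using a mutex; the allocator below is illustrative only and is not Calico's implementation:

    // ipamlock.go - illustrative sketch of the serialization visible above:
    // concurrent allocation requests take a host-wide lock and hand out the
    // next free address in 192.168.88.128/26 one at a time.
    package main

    import (
        "fmt"
        "sync"
    )

    type allocator struct {
        mu   sync.Mutex // plays the role of the "host-wide IPAM lock" in the log
        next int        // next host offset inside the /26 block
    }

    func (a *allocator) assign() string {
        a.mu.Lock()         // "About to acquire ..." -> "Acquired host-wide IPAM lock."
        defer a.mu.Unlock() // "Released host-wide IPAM lock."
        a.next++
        return fmt.Sprintf("192.168.88.%d/26", 128+a.next)
    }

    func main() {
        a := &allocator{}
        var wg sync.WaitGroup
        for _, pod := range []string{"coredns-zx22h", "csi-node-driver-ddmk9", "calico-kube-controllers", "calico-apiserver"} {
            wg.Add(1)
            go func(pod string) {
                defer wg.Done()
                fmt.Printf("%s -> %s\n", pod, a.assign())
            }(pod)
        }
        wg.Wait()
    }

The addresses come out in claim order (.129 onward) regardless of which goroutine asked first, mirroring how lock acquisition, not request arrival time, ordered the assignments in the entries above.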
Mar 17 17:30:42.097874 containerd[1538]: 2025-03-17 17:30:42.067 [INFO][4950] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" HandleID="k8s-pod-network.db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Workload="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.098704 containerd[1538]: 2025-03-17 17:30:42.071 [INFO][4886] cni-plugin/k8s.go 386: Populated endpoint ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0", GenerateName:"calico-apiserver-57698566f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"511a8295-1573-4289-aeb3-f68df5d31b5a", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57698566f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-57698566f9-jhj7p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali33efb402993", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.098704 containerd[1538]: 2025-03-17 17:30:42.071 [INFO][4886] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.098704 containerd[1538]: 2025-03-17 17:30:42.071 [INFO][4886] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33efb402993 ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.098704 containerd[1538]: 2025-03-17 17:30:42.073 [INFO][4886] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.098704 containerd[1538]: 2025-03-17 17:30:42.074 [INFO][4886] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0", GenerateName:"calico-apiserver-57698566f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"511a8295-1573-4289-aeb3-f68df5d31b5a", ResourceVersion:"756", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57698566f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4", Pod:"calico-apiserver-57698566f9-jhj7p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali33efb402993", MAC:"76:17:b3:62:03:4b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.098704 containerd[1538]: 2025-03-17 17:30:42.088 [INFO][4886] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-jhj7p" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--jhj7p-eth0" Mar 17 17:30:42.100541 containerd[1538]: time="2025-03-17T17:30:42.100464651Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:42.100541 containerd[1538]: time="2025-03-17T17:30:42.100520691Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:42.100647 containerd[1538]: time="2025-03-17T17:30:42.100563611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.102004 containerd[1538]: time="2025-03-17T17:30:42.101748814Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.127517 systemd-resolved[1437]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:30:42.138120 containerd[1538]: time="2025-03-17T17:30:42.138054816Z" level=info msg="CreateContainer within sandbox \"2c5586a84aac5af6d57ad04af486bd22ffbff91c3754c24538697365bd2b836d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"69ded59463b9ef2e63f32410f56bec6cd6ab29e8956126cc0a28d093ba6050ab\"" Mar 17 17:30:42.139225 systemd-networkd[1232]: cali91c26634882: Link UP Mar 17 17:30:42.140567 systemd-networkd[1232]: cali91c26634882: Gained carrier Mar 17 17:30:42.141280 containerd[1538]: time="2025-03-17T17:30:42.141239743Z" level=info msg="StartContainer for \"69ded59463b9ef2e63f32410f56bec6cd6ab29e8956126cc0a28d093ba6050ab\"" Mar 17 17:30:42.160862 containerd[1538]: time="2025-03-17T17:30:42.160739107Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:42.162053 containerd[1538]: time="2025-03-17T17:30:42.161999590Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:42.162053 containerd[1538]: time="2025-03-17T17:30:42.162019030Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.163985 containerd[1538]: time="2025-03-17T17:30:42.162734391Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:41.491 [INFO][4848] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:41.555 [INFO][4848] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0 coredns-7db6d8ff4d- kube-system abd10042-888c-4822-b83a-6040f4449647 764 0 2025-03-17 17:30:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-8tsrg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali91c26634882 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:41.555 [INFO][4848] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:41.844 [INFO][4921] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" HandleID="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Workload="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4921] ipam/ipam_plugin.go 265: 
Auto assigning IP ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" HandleID="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Workload="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000395bd0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-8tsrg", "timestamp":"2025-03-17 17:30:41.844331385 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:41.894 [INFO][4921] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.067 [INFO][4921] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.067 [INFO][4921] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.075 [INFO][4921] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.092 [INFO][4921] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.098 [INFO][4921] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.100 [INFO][4921] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.108 [INFO][4921] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.108 [INFO][4921] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.110 [INFO][4921] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8 Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.117 [INFO][4921] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.129 [INFO][4921] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.129 [INFO][4921] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" host="localhost" Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.129 [INFO][4921] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:30:42.169488 containerd[1538]: 2025-03-17 17:30:42.129 [INFO][4921] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" HandleID="k8s-pod-network.3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Workload="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.170019 containerd[1538]: 2025-03-17 17:30:42.136 [INFO][4848] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"abd10042-888c-4822-b83a-6040f4449647", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-8tsrg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali91c26634882", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.170019 containerd[1538]: 2025-03-17 17:30:42.136 [INFO][4848] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.170019 containerd[1538]: 2025-03-17 17:30:42.136 [INFO][4848] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91c26634882 ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.170019 containerd[1538]: 2025-03-17 17:30:42.141 [INFO][4848] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.170019 containerd[1538]: 2025-03-17 17:30:42.147 
[INFO][4848] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"abd10042-888c-4822-b83a-6040f4449647", ResourceVersion:"764", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8", Pod:"coredns-7db6d8ff4d-8tsrg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali91c26634882", MAC:"8a:b0:76:a8:f8:07", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.170019 containerd[1538]: 2025-03-17 17:30:42.166 [INFO][4848] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8tsrg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--8tsrg-eth0" Mar 17 17:30:42.181531 systemd[1]: run-netns-cni\x2da4e70628\x2d0449\x2d451a\x2d111d\x2da3a73e476650.mount: Deactivated successfully. Mar 17 17:30:42.181669 systemd[1]: run-netns-cni\x2d66c123cf\x2db896\x2d7c9b\x2d654b\x2de55e542f0081.mount: Deactivated successfully. Mar 17 17:30:42.190649 systemd[1]: Started sshd@9-10.0.0.79:22-10.0.0.1:57442.service - OpenSSH per-connection server daemon (10.0.0.1:57442). 
Mar 17 17:30:42.204637 systemd-resolved[1437]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:30:42.212993 containerd[1538]: time="2025-03-17T17:30:42.212441943Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-586bfd9c56-zr9qf,Uid:2e8bd960-5c0f-417d-9ed7-865fc7dd73d5,Namespace:calico-system,Attempt:5,} returns sandbox id \"ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68\"" Mar 17 17:30:42.221737 systemd-networkd[1232]: cali7a92a556712: Link UP Mar 17 17:30:42.224005 systemd-networkd[1232]: cali7a92a556712: Gained carrier Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:41.456 [INFO][4835] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:41.561 [INFO][4835] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0 calico-apiserver-57698566f9- calico-apiserver dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d 763 0 2025-03-17 17:30:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:57698566f9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-57698566f9-8n7jx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7a92a556712 [] []}} ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:41.561 [INFO][4835] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:41.843 [INFO][4925] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" HandleID="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Workload="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:41.897 [INFO][4925] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" HandleID="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Workload="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002e56d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-57698566f9-8n7jx", "timestamp":"2025-03-17 17:30:41.843753783 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:41.897 [INFO][4925] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.132 [INFO][4925] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.133 [INFO][4925] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.138 [INFO][4925] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.144 [INFO][4925] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.156 [INFO][4925] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.171 [INFO][4925] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.174 [INFO][4925] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.174 [INFO][4925] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.179 [INFO][4925] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4 Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.185 [INFO][4925] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.196 [INFO][4925] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.196 [INFO][4925] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" host="localhost" Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.196 [INFO][4925] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:30:42.242825 containerd[1538]: 2025-03-17 17:30:42.196 [INFO][4925] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" HandleID="k8s-pod-network.36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Workload="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.244019 containerd[1538]: 2025-03-17 17:30:42.203 [INFO][4835] cni-plugin/k8s.go 386: Populated endpoint ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0", GenerateName:"calico-apiserver-57698566f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57698566f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-57698566f9-8n7jx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7a92a556712", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.244019 containerd[1538]: 2025-03-17 17:30:42.208 [INFO][4835] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.244019 containerd[1538]: 2025-03-17 17:30:42.208 [INFO][4835] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a92a556712 ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.244019 containerd[1538]: 2025-03-17 17:30:42.224 [INFO][4835] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.244019 containerd[1538]: 2025-03-17 17:30:42.225 [INFO][4835] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0", GenerateName:"calico-apiserver-57698566f9-", Namespace:"calico-apiserver", SelfLink:"", UID:"dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 30, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"57698566f9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4", Pod:"calico-apiserver-57698566f9-8n7jx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7a92a556712", MAC:"8a:63:d9:53:32:4f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:30:42.244019 containerd[1538]: 2025-03-17 17:30:42.235 [INFO][4835] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4" Namespace="calico-apiserver" Pod="calico-apiserver-57698566f9-8n7jx" WorkloadEndpoint="localhost-k8s-calico--apiserver--57698566f9--8n7jx-eth0" Mar 17 17:30:42.245267 containerd[1538]: time="2025-03-17T17:30:42.245054497Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:42.245562 containerd[1538]: time="2025-03-17T17:30:42.245421458Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:42.245562 containerd[1538]: time="2025-03-17T17:30:42.245449538Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.246152 containerd[1538]: time="2025-03-17T17:30:42.246087099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.252632 containerd[1538]: time="2025-03-17T17:30:42.252580834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-jhj7p,Uid:511a8295-1573-4289-aeb3-f68df5d31b5a,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4\"" Mar 17 17:30:42.268886 sshd[5178]: Accepted publickey for core from 10.0.0.1 port 57442 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:42.274118 sshd-session[5178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:42.278208 systemd-resolved[1437]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:30:42.284842 systemd-logind[1523]: New session 10 of user core. Mar 17 17:30:42.286267 containerd[1538]: time="2025-03-17T17:30:42.285940629Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:30:42.286267 containerd[1538]: time="2025-03-17T17:30:42.286031429Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:30:42.286267 containerd[1538]: time="2025-03-17T17:30:42.286043509Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.286267 containerd[1538]: time="2025-03-17T17:30:42.286165349Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:30:42.290170 systemd[1]: Started session-10.scope - Session 10 of User core. 
Mar 17 17:30:42.309402 containerd[1538]: time="2025-03-17T17:30:42.309354682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8tsrg,Uid:abd10042-888c-4822-b83a-6040f4449647,Namespace:kube-system,Attempt:5,} returns sandbox id \"3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8\"" Mar 17 17:30:42.309518 containerd[1538]: time="2025-03-17T17:30:42.309383842Z" level=info msg="StartContainer for \"69ded59463b9ef2e63f32410f56bec6cd6ab29e8956126cc0a28d093ba6050ab\" returns successfully" Mar 17 17:30:42.311723 kubelet[2762]: E0317 17:30:42.311693 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:42.314734 containerd[1538]: time="2025-03-17T17:30:42.314266173Z" level=info msg="CreateContainer within sandbox \"3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:30:42.321728 systemd-resolved[1437]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:30:42.347518 containerd[1538]: time="2025-03-17T17:30:42.347430607Z" level=info msg="CreateContainer within sandbox \"3c96f7843205c95e36b0aa0826c96116cc5eeac27e526b7eea8165c7e86292c8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"222804d94b77db2729ca3c0b6a12a491fecf430754d4ce8c37591b863fead4b4\"" Mar 17 17:30:42.349655 containerd[1538]: time="2025-03-17T17:30:42.348212009Z" level=info msg="StartContainer for \"222804d94b77db2729ca3c0b6a12a491fecf430754d4ce8c37591b863fead4b4\"" Mar 17 17:30:42.349655 containerd[1538]: time="2025-03-17T17:30:42.348593810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-57698566f9-8n7jx,Uid:dd7482c6-d26c-4db4-bdd4-f4d33b8a6c7d,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4\"" Mar 17 17:30:42.435207 kubelet[2762]: E0317 17:30:42.434998 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:42.507978 kubelet[2762]: I0317 17:30:42.503294 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:42.507978 kubelet[2762]: E0317 17:30:42.504118 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:42.599220 containerd[1538]: time="2025-03-17T17:30:42.599168894Z" level=info msg="StartContainer for \"222804d94b77db2729ca3c0b6a12a491fecf430754d4ce8c37591b863fead4b4\" returns successfully" Mar 17 17:30:42.680381 sshd[5296]: Connection closed by 10.0.0.1 port 57442 Mar 17 17:30:42.680904 sshd-session[5178]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:42.690222 systemd[1]: Started sshd@10-10.0.0.79:22-10.0.0.1:47034.service - OpenSSH per-connection server daemon (10.0.0.1:47034). Mar 17 17:30:42.690605 systemd[1]: sshd@9-10.0.0.79:22-10.0.0.1:57442.service: Deactivated successfully. Mar 17 17:30:42.693806 systemd-logind[1523]: Session 10 logged out. Waiting for processes to exit. Mar 17 17:30:42.693925 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 17:30:42.700624 systemd-logind[1523]: Removed session 10. 
Mar 17 17:30:42.757860 sshd[5484]: Accepted publickey for core from 10.0.0.1 port 47034 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:42.760272 sshd-session[5484]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:42.772488 systemd-logind[1523]: New session 11 of user core. Mar 17 17:30:42.776984 kernel: bpftool[5520]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 17 17:30:42.778196 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 17 17:30:42.952228 systemd-networkd[1232]: vxlan.calico: Link UP Mar 17 17:30:42.952238 systemd-networkd[1232]: vxlan.calico: Gained carrier Mar 17 17:30:43.028118 sshd[5522]: Connection closed by 10.0.0.1 port 47034 Mar 17 17:30:43.029242 sshd-session[5484]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:43.037452 systemd[1]: Started sshd@11-10.0.0.79:22-10.0.0.1:47038.service - OpenSSH per-connection server daemon (10.0.0.1:47038). Mar 17 17:30:43.037883 systemd[1]: sshd@10-10.0.0.79:22-10.0.0.1:47034.service: Deactivated successfully. Mar 17 17:30:43.053163 systemd[1]: session-11.scope: Deactivated successfully. Mar 17 17:30:43.055306 systemd-logind[1523]: Session 11 logged out. Waiting for processes to exit. Mar 17 17:30:43.057375 systemd-logind[1523]: Removed session 11. Mar 17 17:30:43.111386 sshd[5567]: Accepted publickey for core from 10.0.0.1 port 47038 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:43.112968 sshd-session[5567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:43.119143 systemd-logind[1523]: New session 12 of user core. Mar 17 17:30:43.131252 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 17 17:30:43.312165 systemd-networkd[1232]: calid1ecc893db5: Gained IPv6LL Mar 17 17:30:43.350191 sshd[5587]: Connection closed by 10.0.0.1 port 47038 Mar 17 17:30:43.349325 sshd-session[5567]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:43.352554 systemd[1]: sshd@11-10.0.0.79:22-10.0.0.1:47038.service: Deactivated successfully. Mar 17 17:30:43.356654 systemd-logind[1523]: Session 12 logged out. Waiting for processes to exit. Mar 17 17:30:43.357148 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 17:30:43.358832 systemd-logind[1523]: Removed session 12. 
Mar 17 17:30:43.377032 systemd-networkd[1232]: cali91c26634882: Gained IPv6LL Mar 17 17:30:43.414242 containerd[1538]: time="2025-03-17T17:30:43.414189266Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:43.414762 containerd[1538]: time="2025-03-17T17:30:43.414708267Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 17 17:30:43.415639 containerd[1538]: time="2025-03-17T17:30:43.415601389Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:43.418040 containerd[1538]: time="2025-03-17T17:30:43.418002674Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:43.418766 containerd[1538]: time="2025-03-17T17:30:43.418726676Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.328044927s" Mar 17 17:30:43.418766 containerd[1538]: time="2025-03-17T17:30:43.418761556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 17 17:30:43.419738 containerd[1538]: time="2025-03-17T17:30:43.419553278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 17 17:30:43.422484 containerd[1538]: time="2025-03-17T17:30:43.422450484Z" level=info msg="CreateContainer within sandbox \"bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 17:30:43.434601 containerd[1538]: time="2025-03-17T17:30:43.434547551Z" level=info msg="CreateContainer within sandbox \"bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"18286cc5ce5f0158fb8f364abcec467157945e1942e0b050eafb8a7e60446196\"" Mar 17 17:30:43.435065 containerd[1538]: time="2025-03-17T17:30:43.435011592Z" level=info msg="StartContainer for \"18286cc5ce5f0158fb8f364abcec467157945e1942e0b050eafb8a7e60446196\"" Mar 17 17:30:43.436269 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2246900146.mount: Deactivated successfully. 
Mar 17 17:30:43.441248 systemd-networkd[1232]: cali7a92a556712: Gained IPv6LL Mar 17 17:30:43.504191 systemd-networkd[1232]: cali0c0ec5c2ce1: Gained IPv6LL Mar 17 17:30:43.514249 containerd[1538]: time="2025-03-17T17:30:43.514166885Z" level=info msg="StartContainer for \"18286cc5ce5f0158fb8f364abcec467157945e1942e0b050eafb8a7e60446196\" returns successfully" Mar 17 17:30:43.526788 kubelet[2762]: E0317 17:30:43.526419 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:43.526788 kubelet[2762]: E0317 17:30:43.526577 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:43.541830 kubelet[2762]: I0317 17:30:43.541534 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-zx22h" podStartSLOduration=24.541515505 podStartE2EDuration="24.541515505s" podCreationTimestamp="2025-03-17 17:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:42.477652541 +0000 UTC m=+39.373412430" watchObservedRunningTime="2025-03-17 17:30:43.541515505 +0000 UTC m=+40.437275354" Mar 17 17:30:43.555367 kubelet[2762]: I0317 17:30:43.554900 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-8tsrg" podStartSLOduration=24.554855215 podStartE2EDuration="24.554855215s" podCreationTimestamp="2025-03-17 17:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:30:43.541609746 +0000 UTC m=+40.437369635" watchObservedRunningTime="2025-03-17 17:30:43.554855215 +0000 UTC m=+40.450615064" Mar 17 17:30:43.568080 systemd-networkd[1232]: cali33efb402993: Gained IPv6LL Mar 17 17:30:43.631997 systemd-networkd[1232]: calicf80693fa74: Gained IPv6LL Mar 17 17:30:44.533895 kubelet[2762]: E0317 17:30:44.531405 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:44.533895 kubelet[2762]: E0317 17:30:44.532202 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:44.656004 systemd-networkd[1232]: vxlan.calico: Gained IPv6LL Mar 17 17:30:45.127192 containerd[1538]: time="2025-03-17T17:30:45.126418237Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:45.127192 containerd[1538]: time="2025-03-17T17:30:45.127149838Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 17 17:30:45.127940 containerd[1538]: time="2025-03-17T17:30:45.127914360Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:45.130821 containerd[1538]: time="2025-03-17T17:30:45.130750166Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:45.131323 containerd[1538]: time="2025-03-17T17:30:45.131278807Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 1.711686129s" Mar 17 17:30:45.131323 containerd[1538]: time="2025-03-17T17:30:45.131317647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 17 17:30:45.132854 containerd[1538]: time="2025-03-17T17:30:45.132763650Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:30:45.139384 containerd[1538]: time="2025-03-17T17:30:45.139305944Z" level=info msg="CreateContainer within sandbox \"ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 17:30:45.148193 containerd[1538]: time="2025-03-17T17:30:45.148153442Z" level=info msg="CreateContainer within sandbox \"ca52b1a7d2341a0e0b1437c5ff3cb481513eb37232c80ff565800f73b76efc68\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"73d053f6fd3f8a3789eaa1a2e7f6f3a3db5a0f456ec04ebadfe215719573792e\"" Mar 17 17:30:45.148629 containerd[1538]: time="2025-03-17T17:30:45.148610643Z" level=info msg="StartContainer for \"73d053f6fd3f8a3789eaa1a2e7f6f3a3db5a0f456ec04ebadfe215719573792e\"" Mar 17 17:30:45.228229 containerd[1538]: time="2025-03-17T17:30:45.227169847Z" level=info msg="StartContainer for \"73d053f6fd3f8a3789eaa1a2e7f6f3a3db5a0f456ec04ebadfe215719573792e\" returns successfully" Mar 17 17:30:45.537336 kubelet[2762]: E0317 17:30:45.536954 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:45.609110 kubelet[2762]: I0317 17:30:45.609031 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-586bfd9c56-zr9qf" podStartSLOduration=14.692888547999999 podStartE2EDuration="17.609014326s" podCreationTimestamp="2025-03-17 17:30:28 +0000 UTC" firstStartedPulling="2025-03-17 17:30:42.215967591 +0000 UTC m=+39.111727480" lastFinishedPulling="2025-03-17 17:30:45.132093369 +0000 UTC m=+42.027853258" observedRunningTime="2025-03-17 17:30:45.607809563 +0000 UTC m=+42.503569452" watchObservedRunningTime="2025-03-17 17:30:45.609014326 +0000 UTC m=+42.504774215" Mar 17 17:30:47.237814 containerd[1538]: time="2025-03-17T17:30:47.237755781Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:47.238512 containerd[1538]: time="2025-03-17T17:30:47.238461383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 17 17:30:47.239521 containerd[1538]: time="2025-03-17T17:30:47.239479545Z" level=info msg="ImageCreate event 
name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:47.242514 containerd[1538]: time="2025-03-17T17:30:47.242475631Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:47.243886 containerd[1538]: time="2025-03-17T17:30:47.243321312Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 2.110519702s" Mar 17 17:30:47.243886 containerd[1538]: time="2025-03-17T17:30:47.243369513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 17 17:30:47.245956 containerd[1538]: time="2025-03-17T17:30:47.245910918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 17 17:30:47.247764 containerd[1538]: time="2025-03-17T17:30:47.247723921Z" level=info msg="CreateContainer within sandbox \"db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:30:47.280492 containerd[1538]: time="2025-03-17T17:30:47.280340106Z" level=info msg="CreateContainer within sandbox \"db13b8c19072f11f895de40cf37d30cf3b20814d766be38253b8ff5ea465ddb4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ac5d0152b07aa0226c7910de721b28624047d23492ebc9664b8373340b31977a\"" Mar 17 17:30:47.281142 containerd[1538]: time="2025-03-17T17:30:47.281030268Z" level=info msg="StartContainer for \"ac5d0152b07aa0226c7910de721b28624047d23492ebc9664b8373340b31977a\"" Mar 17 17:30:47.346550 containerd[1538]: time="2025-03-17T17:30:47.346429679Z" level=info msg="StartContainer for \"ac5d0152b07aa0226c7910de721b28624047d23492ebc9664b8373340b31977a\" returns successfully" Mar 17 17:30:47.552945 containerd[1538]: time="2025-03-17T17:30:47.552819091Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:47.554380 containerd[1538]: time="2025-03-17T17:30:47.553908653Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 17 17:30:47.560545 kubelet[2762]: I0317 17:30:47.558529 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57698566f9-jhj7p" podStartSLOduration=16.569066387 podStartE2EDuration="21.558510782s" podCreationTimestamp="2025-03-17 17:30:26 +0000 UTC" firstStartedPulling="2025-03-17 17:30:42.255707641 +0000 UTC m=+39.151467490" lastFinishedPulling="2025-03-17 17:30:47.245151996 +0000 UTC m=+44.140911885" observedRunningTime="2025-03-17 17:30:47.555151256 +0000 UTC m=+44.450911145" watchObservedRunningTime="2025-03-17 17:30:47.558510782 +0000 UTC m=+44.454270631" Mar 17 17:30:47.567605 containerd[1538]: time="2025-03-17T17:30:47.567550281Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id 
\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 321.593323ms" Mar 17 17:30:47.567605 containerd[1538]: time="2025-03-17T17:30:47.567610441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 17 17:30:47.569428 containerd[1538]: time="2025-03-17T17:30:47.568914083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 17 17:30:47.571076 containerd[1538]: time="2025-03-17T17:30:47.571028007Z" level=info msg="CreateContainer within sandbox \"36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 17:30:47.587630 containerd[1538]: time="2025-03-17T17:30:47.587585521Z" level=info msg="CreateContainer within sandbox \"36593501f8a6489f446f47b87293a94e96f83dd902f7740d79f30a7956a972e4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a1237ba402c59c38e62cda02dc203e7f4ca0ea5be9e551b22fdd4ae269e76cbf\"" Mar 17 17:30:47.588924 containerd[1538]: time="2025-03-17T17:30:47.588315042Z" level=info msg="StartContainer for \"a1237ba402c59c38e62cda02dc203e7f4ca0ea5be9e551b22fdd4ae269e76cbf\"" Mar 17 17:30:47.660461 containerd[1538]: time="2025-03-17T17:30:47.660423466Z" level=info msg="StartContainer for \"a1237ba402c59c38e62cda02dc203e7f4ca0ea5be9e551b22fdd4ae269e76cbf\" returns successfully" Mar 17 17:30:48.363164 systemd[1]: Started sshd@12-10.0.0.79:22-10.0.0.1:47054.service - OpenSSH per-connection server daemon (10.0.0.1:47054). Mar 17 17:30:48.412206 sshd[5831]: Accepted publickey for core from 10.0.0.1 port 47054 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:48.414325 sshd-session[5831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:48.421571 systemd-logind[1523]: New session 13 of user core. Mar 17 17:30:48.427236 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 17 17:30:48.548220 kubelet[2762]: I0317 17:30:48.548177 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:48.562003 kubelet[2762]: I0317 17:30:48.561941 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-57698566f9-8n7jx" podStartSLOduration=17.346298542 podStartE2EDuration="22.561925885s" podCreationTimestamp="2025-03-17 17:30:26 +0000 UTC" firstStartedPulling="2025-03-17 17:30:42.352817059 +0000 UTC m=+39.248576908" lastFinishedPulling="2025-03-17 17:30:47.568444362 +0000 UTC m=+44.464204251" observedRunningTime="2025-03-17 17:30:48.560655042 +0000 UTC m=+45.456414931" watchObservedRunningTime="2025-03-17 17:30:48.561925885 +0000 UTC m=+45.457685774" Mar 17 17:30:48.629802 sshd[5834]: Connection closed by 10.0.0.1 port 47054 Mar 17 17:30:48.632091 sshd-session[5831]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:48.639244 systemd[1]: Started sshd@13-10.0.0.79:22-10.0.0.1:47062.service - OpenSSH per-connection server daemon (10.0.0.1:47062). Mar 17 17:30:48.639652 systemd[1]: sshd@12-10.0.0.79:22-10.0.0.1:47054.service: Deactivated successfully. Mar 17 17:30:48.642985 systemd-logind[1523]: Session 13 logged out. 
Waiting for processes to exit. Mar 17 17:30:48.644039 systemd[1]: session-13.scope: Deactivated successfully. Mar 17 17:30:48.644808 systemd-logind[1523]: Removed session 13. Mar 17 17:30:48.678487 sshd[5845]: Accepted publickey for core from 10.0.0.1 port 47062 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:48.679690 sshd-session[5845]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:48.686414 systemd-logind[1523]: New session 14 of user core. Mar 17 17:30:48.693299 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 17 17:30:49.000703 containerd[1538]: time="2025-03-17T17:30:48.999686582Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:49.001378 containerd[1538]: time="2025-03-17T17:30:49.000769784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 17 17:30:49.003374 containerd[1538]: time="2025-03-17T17:30:49.003301268Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:49.013730 containerd[1538]: time="2025-03-17T17:30:49.013658088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:30:49.014790 containerd[1538]: time="2025-03-17T17:30:49.014426010Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.445297486s" Mar 17 17:30:49.014790 containerd[1538]: time="2025-03-17T17:30:49.014476130Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 17 17:30:49.018037 containerd[1538]: time="2025-03-17T17:30:49.018004497Z" level=info msg="CreateContainer within sandbox \"bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 17:30:49.037691 containerd[1538]: time="2025-03-17T17:30:49.037639214Z" level=info msg="CreateContainer within sandbox \"bf04a73e378245fcd3f89366466cd3e40ff584aadd939dcee954373b19a5395b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"8c33c06c34330775769c3db985bc458a6da7d4b3e59e3d38af6a6e440e0b9b6a\"" Mar 17 17:30:49.041334 containerd[1538]: time="2025-03-17T17:30:49.039096497Z" level=info msg="StartContainer for \"8c33c06c34330775769c3db985bc458a6da7d4b3e59e3d38af6a6e440e0b9b6a\"" Mar 17 17:30:49.100507 sshd[5851]: Connection closed by 10.0.0.1 port 47062 Mar 17 17:30:49.100852 sshd-session[5845]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:49.112559 systemd[1]: Started sshd@14-10.0.0.79:22-10.0.0.1:47072.service - OpenSSH per-connection server daemon (10.0.0.1:47072). 
Mar 17 17:30:49.113945 systemd[1]: sshd@13-10.0.0.79:22-10.0.0.1:47062.service: Deactivated successfully. Mar 17 17:30:49.119279 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 17:30:49.122812 systemd-logind[1523]: Session 14 logged out. Waiting for processes to exit. Mar 17 17:30:49.128547 systemd-logind[1523]: Removed session 14. Mar 17 17:30:49.138431 containerd[1538]: time="2025-03-17T17:30:49.138274407Z" level=info msg="StartContainer for \"8c33c06c34330775769c3db985bc458a6da7d4b3e59e3d38af6a6e440e0b9b6a\" returns successfully" Mar 17 17:30:49.169060 sshd[5886]: Accepted publickey for core from 10.0.0.1 port 47072 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:49.170445 sshd-session[5886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:49.180681 systemd-logind[1523]: New session 15 of user core. Mar 17 17:30:49.187149 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 17 17:30:49.290888 kubelet[2762]: I0317 17:30:49.290741 2762 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 17:30:49.296417 kubelet[2762]: I0317 17:30:49.296379 2762 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 17:30:49.583545 kubelet[2762]: I0317 17:30:49.583484 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-ddmk9" podStartSLOduration=14.658166757 podStartE2EDuration="21.583467181s" podCreationTimestamp="2025-03-17 17:30:28 +0000 UTC" firstStartedPulling="2025-03-17 17:30:42.090018468 +0000 UTC m=+38.985778357" lastFinishedPulling="2025-03-17 17:30:49.015318892 +0000 UTC m=+45.911078781" observedRunningTime="2025-03-17 17:30:49.583063501 +0000 UTC m=+46.478823390" watchObservedRunningTime="2025-03-17 17:30:49.583467181 +0000 UTC m=+46.479227070" Mar 17 17:30:50.608194 sshd[5904]: Connection closed by 10.0.0.1 port 47072 Mar 17 17:30:50.609257 sshd-session[5886]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:50.613622 systemd[1]: sshd@14-10.0.0.79:22-10.0.0.1:47072.service: Deactivated successfully. Mar 17 17:30:50.616002 systemd-logind[1523]: Session 15 logged out. Waiting for processes to exit. Mar 17 17:30:50.621409 systemd[1]: Started sshd@15-10.0.0.79:22-10.0.0.1:47082.service - OpenSSH per-connection server daemon (10.0.0.1:47082). Mar 17 17:30:50.621843 systemd[1]: session-15.scope: Deactivated successfully. Mar 17 17:30:50.625478 systemd-logind[1523]: Removed session 15. Mar 17 17:30:50.675983 sshd[5928]: Accepted publickey for core from 10.0.0.1 port 47082 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:50.677485 sshd-session[5928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:50.681549 systemd-logind[1523]: New session 16 of user core. Mar 17 17:30:50.693230 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 17 17:30:51.019512 sshd[5931]: Connection closed by 10.0.0.1 port 47082 Mar 17 17:30:51.020118 sshd-session[5928]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:51.030357 systemd[1]: Started sshd@16-10.0.0.79:22-10.0.0.1:47088.service - OpenSSH per-connection server daemon (10.0.0.1:47088). 
Mar 17 17:30:51.030763 systemd[1]: sshd@15-10.0.0.79:22-10.0.0.1:47082.service: Deactivated successfully. Mar 17 17:30:51.035513 systemd-logind[1523]: Session 16 logged out. Waiting for processes to exit. Mar 17 17:30:51.035901 systemd[1]: session-16.scope: Deactivated successfully. Mar 17 17:30:51.040028 systemd-logind[1523]: Removed session 16. Mar 17 17:30:51.075218 sshd[5939]: Accepted publickey for core from 10.0.0.1 port 47088 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:51.076825 sshd-session[5939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:51.081666 systemd-logind[1523]: New session 17 of user core. Mar 17 17:30:51.093183 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 17 17:30:51.242833 sshd[5945]: Connection closed by 10.0.0.1 port 47088 Mar 17 17:30:51.243204 sshd-session[5939]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:51.246956 systemd[1]: sshd@16-10.0.0.79:22-10.0.0.1:47088.service: Deactivated successfully. Mar 17 17:30:51.249116 systemd[1]: session-17.scope: Deactivated successfully. Mar 17 17:30:51.249771 systemd-logind[1523]: Session 17 logged out. Waiting for processes to exit. Mar 17 17:30:51.251327 systemd-logind[1523]: Removed session 17. Mar 17 17:30:53.554936 kubelet[2762]: I0317 17:30:53.554746 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:30:53.555612 kubelet[2762]: E0317 17:30:53.555569 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:53.630398 kubelet[2762]: E0317 17:30:53.630363 2762 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 17:30:56.254147 systemd[1]: Started sshd@17-10.0.0.79:22-10.0.0.1:47386.service - OpenSSH per-connection server daemon (10.0.0.1:47386). Mar 17 17:30:56.291544 sshd[6015]: Accepted publickey for core from 10.0.0.1 port 47386 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:30:56.293039 sshd-session[6015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:30:56.296934 systemd-logind[1523]: New session 18 of user core. Mar 17 17:30:56.304218 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 17 17:30:56.434244 sshd[6018]: Connection closed by 10.0.0.1 port 47386 Mar 17 17:30:56.434641 sshd-session[6015]: pam_unix(sshd:session): session closed for user core Mar 17 17:30:56.437527 systemd-logind[1523]: Session 18 logged out. Waiting for processes to exit. Mar 17 17:30:56.437648 systemd[1]: sshd@17-10.0.0.79:22-10.0.0.1:47386.service: Deactivated successfully. Mar 17 17:30:56.440535 systemd[1]: session-18.scope: Deactivated successfully. Mar 17 17:30:56.441247 systemd-logind[1523]: Removed session 18. Mar 17 17:31:01.455197 systemd[1]: Started sshd@18-10.0.0.79:22-10.0.0.1:47402.service - OpenSSH per-connection server daemon (10.0.0.1:47402). Mar 17 17:31:01.501592 sshd[6052]: Accepted publickey for core from 10.0.0.1 port 47402 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:31:01.504809 sshd-session[6052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:31:01.509629 systemd-logind[1523]: New session 19 of user core. 
Mar 17 17:31:01.523220 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 17 17:31:01.718437 sshd[6055]: Connection closed by 10.0.0.1 port 47402 Mar 17 17:31:01.719034 sshd-session[6052]: pam_unix(sshd:session): session closed for user core Mar 17 17:31:01.722827 systemd[1]: sshd@18-10.0.0.79:22-10.0.0.1:47402.service: Deactivated successfully. Mar 17 17:31:01.725075 systemd-logind[1523]: Session 19 logged out. Waiting for processes to exit. Mar 17 17:31:01.725176 systemd[1]: session-19.scope: Deactivated successfully. Mar 17 17:31:01.726660 systemd-logind[1523]: Removed session 19. Mar 17 17:31:03.180979 containerd[1538]: time="2025-03-17T17:31:03.180916302Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:31:03.181404 containerd[1538]: time="2025-03-17T17:31:03.181041902Z" level=info msg="TearDown network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:31:03.181404 containerd[1538]: time="2025-03-17T17:31:03.181054742Z" level=info msg="StopPodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 17 17:31:03.182473 containerd[1538]: time="2025-03-17T17:31:03.181460862Z" level=info msg="RemovePodSandbox for \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:31:03.182473 containerd[1538]: time="2025-03-17T17:31:03.181500022Z" level=info msg="Forcibly stopping sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\"" Mar 17 17:31:03.182473 containerd[1538]: time="2025-03-17T17:31:03.181571783Z" level=info msg="TearDown network for sandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" successfully" Mar 17 17:31:03.184537 containerd[1538]: time="2025-03-17T17:31:03.184500507Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.184624 containerd[1538]: time="2025-03-17T17:31:03.184578907Z" level=info msg="RemovePodSandbox \"6732499f8a23a6ddb3cc3270490dd45aa4aaefd6b598b6f80da430fa55ae1a6a\" returns successfully" Mar 17 17:31:03.185080 containerd[1538]: time="2025-03-17T17:31:03.185052628Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:31:03.185172 containerd[1538]: time="2025-03-17T17:31:03.185157548Z" level=info msg="TearDown network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" successfully" Mar 17 17:31:03.185208 containerd[1538]: time="2025-03-17T17:31:03.185172228Z" level=info msg="StopPodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" returns successfully" Mar 17 17:31:03.186385 containerd[1538]: time="2025-03-17T17:31:03.186348030Z" level=info msg="RemovePodSandbox for \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:31:03.186429 containerd[1538]: time="2025-03-17T17:31:03.186394230Z" level=info msg="Forcibly stopping sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\"" Mar 17 17:31:03.186488 containerd[1538]: time="2025-03-17T17:31:03.186473230Z" level=info msg="TearDown network for sandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" successfully" Mar 17 17:31:03.191478 containerd[1538]: time="2025-03-17T17:31:03.191376878Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.191478 containerd[1538]: time="2025-03-17T17:31:03.191445718Z" level=info msg="RemovePodSandbox \"347a1ffdb98887a0a58bd574bacda4c17f4034af3734eedc714ff26d900033de\" returns successfully" Mar 17 17:31:03.192250 containerd[1538]: time="2025-03-17T17:31:03.192159039Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" Mar 17 17:31:03.192322 containerd[1538]: time="2025-03-17T17:31:03.192260839Z" level=info msg="TearDown network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" successfully" Mar 17 17:31:03.192322 containerd[1538]: time="2025-03-17T17:31:03.192271959Z" level=info msg="StopPodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" returns successfully" Mar 17 17:31:03.192703 containerd[1538]: time="2025-03-17T17:31:03.192673240Z" level=info msg="RemovePodSandbox for \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" Mar 17 17:31:03.192758 containerd[1538]: time="2025-03-17T17:31:03.192707320Z" level=info msg="Forcibly stopping sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\"" Mar 17 17:31:03.194472 containerd[1538]: time="2025-03-17T17:31:03.194412723Z" level=info msg="TearDown network for sandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" successfully" Mar 17 17:31:03.201409 containerd[1538]: time="2025-03-17T17:31:03.201353334Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.201537 containerd[1538]: time="2025-03-17T17:31:03.201436974Z" level=info msg="RemovePodSandbox \"d4f019846672f8dd0969aaa80e53a7cf85ac62b1804ca95d0da7dca3dec923d2\" returns successfully" Mar 17 17:31:03.202375 containerd[1538]: time="2025-03-17T17:31:03.201898815Z" level=info msg="StopPodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\"" Mar 17 17:31:03.202375 containerd[1538]: time="2025-03-17T17:31:03.202008295Z" level=info msg="TearDown network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" successfully" Mar 17 17:31:03.202375 containerd[1538]: time="2025-03-17T17:31:03.202019615Z" level=info msg="StopPodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" returns successfully" Mar 17 17:31:03.202375 containerd[1538]: time="2025-03-17T17:31:03.202286655Z" level=info msg="RemovePodSandbox for \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\"" Mar 17 17:31:03.202375 containerd[1538]: time="2025-03-17T17:31:03.202312215Z" level=info msg="Forcibly stopping sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\"" Mar 17 17:31:03.202375 containerd[1538]: time="2025-03-17T17:31:03.202380695Z" level=info msg="TearDown network for sandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" successfully" Mar 17 17:31:03.206952 containerd[1538]: time="2025-03-17T17:31:03.206900302Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.207079 containerd[1538]: time="2025-03-17T17:31:03.206995663Z" level=info msg="RemovePodSandbox \"94aefa34997bcf36f16f3b4e2a0722702a380deb1053c519372963bb133a34c1\" returns successfully" Mar 17 17:31:03.207822 containerd[1538]: time="2025-03-17T17:31:03.207647344Z" level=info msg="StopPodSandbox for \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\"" Mar 17 17:31:03.207822 containerd[1538]: time="2025-03-17T17:31:03.207747544Z" level=info msg="TearDown network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\" successfully" Mar 17 17:31:03.207822 containerd[1538]: time="2025-03-17T17:31:03.207757824Z" level=info msg="StopPodSandbox for \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\" returns successfully" Mar 17 17:31:03.208706 containerd[1538]: time="2025-03-17T17:31:03.208649585Z" level=info msg="RemovePodSandbox for \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\"" Mar 17 17:31:03.208706 containerd[1538]: time="2025-03-17T17:31:03.208697305Z" level=info msg="Forcibly stopping sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\"" Mar 17 17:31:03.208812 containerd[1538]: time="2025-03-17T17:31:03.208791945Z" level=info msg="TearDown network for sandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\" successfully" Mar 17 17:31:03.211830 containerd[1538]: time="2025-03-17T17:31:03.211767990Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.211944 containerd[1538]: time="2025-03-17T17:31:03.211875950Z" level=info msg="RemovePodSandbox \"ba3b2fdccd1aa941bdcd5e7d6fa2bb5f799c8764b0d174ee2e620efe1e12e3e4\" returns successfully" Mar 17 17:31:03.212378 containerd[1538]: time="2025-03-17T17:31:03.212351551Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:31:03.212780 containerd[1538]: time="2025-03-17T17:31:03.212610791Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:31:03.212780 containerd[1538]: time="2025-03-17T17:31:03.212627511Z" level=info msg="StopPodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:31:03.218373 containerd[1538]: time="2025-03-17T17:31:03.217983840Z" level=info msg="RemovePodSandbox for \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:31:03.218373 containerd[1538]: time="2025-03-17T17:31:03.218031160Z" level=info msg="Forcibly stopping sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\"" Mar 17 17:31:03.218373 containerd[1538]: time="2025-03-17T17:31:03.218118080Z" level=info msg="TearDown network for sandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" successfully" Mar 17 17:31:03.220829 containerd[1538]: time="2025-03-17T17:31:03.220770124Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.220945 containerd[1538]: time="2025-03-17T17:31:03.220851684Z" level=info msg="RemovePodSandbox \"8daeaca5f53cb063bbb14b279519c09b7d94e6f60402cf24e086c0d1215d2120\" returns successfully" Mar 17 17:31:03.221917 containerd[1538]: time="2025-03-17T17:31:03.221279605Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:31:03.221917 containerd[1538]: time="2025-03-17T17:31:03.221372285Z" level=info msg="TearDown network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" successfully" Mar 17 17:31:03.221917 containerd[1538]: time="2025-03-17T17:31:03.221382965Z" level=info msg="StopPodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" returns successfully" Mar 17 17:31:03.224308 containerd[1538]: time="2025-03-17T17:31:03.224269290Z" level=info msg="RemovePodSandbox for \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:31:03.224308 containerd[1538]: time="2025-03-17T17:31:03.224308690Z" level=info msg="Forcibly stopping sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\"" Mar 17 17:31:03.224424 containerd[1538]: time="2025-03-17T17:31:03.224392170Z" level=info msg="TearDown network for sandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" successfully" Mar 17 17:31:03.230679 containerd[1538]: time="2025-03-17T17:31:03.230625740Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.230802 containerd[1538]: time="2025-03-17T17:31:03.230709660Z" level=info msg="RemovePodSandbox \"e3365088e9fcf81a3740fd7c797181e58d602228ff5ae5a7aacd6786769a1be5\" returns successfully" Mar 17 17:31:03.233948 containerd[1538]: time="2025-03-17T17:31:03.232282302Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" Mar 17 17:31:03.233948 containerd[1538]: time="2025-03-17T17:31:03.232393822Z" level=info msg="TearDown network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" successfully" Mar 17 17:31:03.233948 containerd[1538]: time="2025-03-17T17:31:03.232404662Z" level=info msg="StopPodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" returns successfully" Mar 17 17:31:03.234785 containerd[1538]: time="2025-03-17T17:31:03.234326665Z" level=info msg="RemovePodSandbox for \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" Mar 17 17:31:03.234785 containerd[1538]: time="2025-03-17T17:31:03.234363226Z" level=info msg="Forcibly stopping sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\"" Mar 17 17:31:03.234785 containerd[1538]: time="2025-03-17T17:31:03.234437266Z" level=info msg="TearDown network for sandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" successfully" Mar 17 17:31:03.243914 containerd[1538]: time="2025-03-17T17:31:03.242415038Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.243914 containerd[1538]: time="2025-03-17T17:31:03.242550998Z" level=info msg="RemovePodSandbox \"33577c4742509b066e158cb2249c581a882a94385dc71d8f24f8508eaa6ef134\" returns successfully" Mar 17 17:31:03.243914 containerd[1538]: time="2025-03-17T17:31:03.243227399Z" level=info msg="StopPodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\"" Mar 17 17:31:03.243914 containerd[1538]: time="2025-03-17T17:31:03.243362760Z" level=info msg="TearDown network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" successfully" Mar 17 17:31:03.243914 containerd[1538]: time="2025-03-17T17:31:03.243374600Z" level=info msg="StopPodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" returns successfully" Mar 17 17:31:03.245259 containerd[1538]: time="2025-03-17T17:31:03.245222563Z" level=info msg="RemovePodSandbox for \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\"" Mar 17 17:31:03.245386 containerd[1538]: time="2025-03-17T17:31:03.245371523Z" level=info msg="Forcibly stopping sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\"" Mar 17 17:31:03.245519 containerd[1538]: time="2025-03-17T17:31:03.245503123Z" level=info msg="TearDown network for sandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" successfully" Mar 17 17:31:03.248728 containerd[1538]: time="2025-03-17T17:31:03.248682408Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.248964 containerd[1538]: time="2025-03-17T17:31:03.248933448Z" level=info msg="RemovePodSandbox \"5210c93e4ec8b771ae9cff952ec2b248e1229ea8f3dd5172d07487d37712d59c\" returns successfully" Mar 17 17:31:03.249598 containerd[1538]: time="2025-03-17T17:31:03.249569409Z" level=info msg="StopPodSandbox for \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\"" Mar 17 17:31:03.249799 containerd[1538]: time="2025-03-17T17:31:03.249778930Z" level=info msg="TearDown network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\" successfully" Mar 17 17:31:03.249894 containerd[1538]: time="2025-03-17T17:31:03.249850850Z" level=info msg="StopPodSandbox for \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\" returns successfully" Mar 17 17:31:03.250296 containerd[1538]: time="2025-03-17T17:31:03.250263530Z" level=info msg="RemovePodSandbox for \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\"" Mar 17 17:31:03.250357 containerd[1538]: time="2025-03-17T17:31:03.250301211Z" level=info msg="Forcibly stopping sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\"" Mar 17 17:31:03.250385 containerd[1538]: time="2025-03-17T17:31:03.250371811Z" level=info msg="TearDown network for sandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\" successfully" Mar 17 17:31:03.253350 containerd[1538]: time="2025-03-17T17:31:03.253303815Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.253443 containerd[1538]: time="2025-03-17T17:31:03.253374055Z" level=info msg="RemovePodSandbox \"b30f9485a2c9eec95d30367c7b66faf774882da18012e01cffb781f7d738342a\" returns successfully" Mar 17 17:31:03.253854 containerd[1538]: time="2025-03-17T17:31:03.253818256Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:31:03.254064 containerd[1538]: time="2025-03-17T17:31:03.254042456Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:31:03.254064 containerd[1538]: time="2025-03-17T17:31:03.254063416Z" level=info msg="StopPodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:31:03.254374 containerd[1538]: time="2025-03-17T17:31:03.254351457Z" level=info msg="RemovePodSandbox for \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:31:03.254425 containerd[1538]: time="2025-03-17T17:31:03.254378097Z" level=info msg="Forcibly stopping sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\"" Mar 17 17:31:03.254456 containerd[1538]: time="2025-03-17T17:31:03.254440097Z" level=info msg="TearDown network for sandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" successfully" Mar 17 17:31:03.257318 containerd[1538]: time="2025-03-17T17:31:03.257281461Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.257391 containerd[1538]: time="2025-03-17T17:31:03.257353822Z" level=info msg="RemovePodSandbox \"a76e6552330f50086b737bac9da758210548aabdc357574ee32fd2c8b52e5b89\" returns successfully" Mar 17 17:31:03.257895 containerd[1538]: time="2025-03-17T17:31:03.257723382Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:31:03.257895 containerd[1538]: time="2025-03-17T17:31:03.257825462Z" level=info msg="TearDown network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" successfully" Mar 17 17:31:03.257895 containerd[1538]: time="2025-03-17T17:31:03.257836302Z" level=info msg="StopPodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" returns successfully" Mar 17 17:31:03.258177 containerd[1538]: time="2025-03-17T17:31:03.258154103Z" level=info msg="RemovePodSandbox for \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:31:03.258207 containerd[1538]: time="2025-03-17T17:31:03.258182223Z" level=info msg="Forcibly stopping sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\"" Mar 17 17:31:03.258272 containerd[1538]: time="2025-03-17T17:31:03.258258023Z" level=info msg="TearDown network for sandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" successfully" Mar 17 17:31:03.261141 containerd[1538]: time="2025-03-17T17:31:03.261097427Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.261227 containerd[1538]: time="2025-03-17T17:31:03.261173908Z" level=info msg="RemovePodSandbox \"5d455b7594ed1fecef1a725d214dd1478ff7fb0ac5b5a85aa958f1e08a62a2f8\" returns successfully" Mar 17 17:31:03.261570 containerd[1538]: time="2025-03-17T17:31:03.261544428Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" Mar 17 17:31:03.261674 containerd[1538]: time="2025-03-17T17:31:03.261656668Z" level=info msg="TearDown network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" successfully" Mar 17 17:31:03.261706 containerd[1538]: time="2025-03-17T17:31:03.261673788Z" level=info msg="StopPodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" returns successfully" Mar 17 17:31:03.262062 containerd[1538]: time="2025-03-17T17:31:03.262026269Z" level=info msg="RemovePodSandbox for \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" Mar 17 17:31:03.262099 containerd[1538]: time="2025-03-17T17:31:03.262061789Z" level=info msg="Forcibly stopping sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\"" Mar 17 17:31:03.262146 containerd[1538]: time="2025-03-17T17:31:03.262132269Z" level=info msg="TearDown network for sandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" successfully" Mar 17 17:31:03.264540 containerd[1538]: time="2025-03-17T17:31:03.264487513Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.264617 containerd[1538]: time="2025-03-17T17:31:03.264548833Z" level=info msg="RemovePodSandbox \"36734ac41d08eeaa40f53bbfc9349c19fe8ed15758565f2ad71ff5a426b93481\" returns successfully" Mar 17 17:31:03.265012 containerd[1538]: time="2025-03-17T17:31:03.264983954Z" level=info msg="StopPodSandbox for \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\"" Mar 17 17:31:03.265117 containerd[1538]: time="2025-03-17T17:31:03.265087634Z" level=info msg="TearDown network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" successfully" Mar 17 17:31:03.265117 containerd[1538]: time="2025-03-17T17:31:03.265104354Z" level=info msg="StopPodSandbox for \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" returns successfully" Mar 17 17:31:03.265453 containerd[1538]: time="2025-03-17T17:31:03.265398994Z" level=info msg="RemovePodSandbox for \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\"" Mar 17 17:31:03.265487 containerd[1538]: time="2025-03-17T17:31:03.265465074Z" level=info msg="Forcibly stopping sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\"" Mar 17 17:31:03.265570 containerd[1538]: time="2025-03-17T17:31:03.265553514Z" level=info msg="TearDown network for sandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" successfully" Mar 17 17:31:03.267963 containerd[1538]: time="2025-03-17T17:31:03.267922718Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.268022 containerd[1538]: time="2025-03-17T17:31:03.267995798Z" level=info msg="RemovePodSandbox \"5938e1a955891a2288e16002198e2c91ae7846480475e8458b631b1050b3a921\" returns successfully" Mar 17 17:31:03.268417 containerd[1538]: time="2025-03-17T17:31:03.268392159Z" level=info msg="StopPodSandbox for \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\"" Mar 17 17:31:03.268508 containerd[1538]: time="2025-03-17T17:31:03.268493119Z" level=info msg="TearDown network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\" successfully" Mar 17 17:31:03.268540 containerd[1538]: time="2025-03-17T17:31:03.268506999Z" level=info msg="StopPodSandbox for \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\" returns successfully" Mar 17 17:31:03.270154 containerd[1538]: time="2025-03-17T17:31:03.268842280Z" level=info msg="RemovePodSandbox for \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\"" Mar 17 17:31:03.270154 containerd[1538]: time="2025-03-17T17:31:03.268903400Z" level=info msg="Forcibly stopping sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\"" Mar 17 17:31:03.270154 containerd[1538]: time="2025-03-17T17:31:03.268983240Z" level=info msg="TearDown network for sandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\" successfully" Mar 17 17:31:03.271488 containerd[1538]: time="2025-03-17T17:31:03.271449684Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.271621 containerd[1538]: time="2025-03-17T17:31:03.271602204Z" level=info msg="RemovePodSandbox \"4c6e47449b10bcdbb6af5e96ad9d2784d28d1ead6dcdfe800e86a7d2c95bdf94\" returns successfully" Mar 17 17:31:03.272112 containerd[1538]: time="2025-03-17T17:31:03.272085725Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:31:03.272198 containerd[1538]: time="2025-03-17T17:31:03.272183005Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:31:03.272198 containerd[1538]: time="2025-03-17T17:31:03.272197525Z" level=info msg="StopPodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:31:03.272591 containerd[1538]: time="2025-03-17T17:31:03.272554365Z" level=info msg="RemovePodSandbox for \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:31:03.272706 containerd[1538]: time="2025-03-17T17:31:03.272611006Z" level=info msg="Forcibly stopping sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\"" Mar 17 17:31:03.272706 containerd[1538]: time="2025-03-17T17:31:03.272671446Z" level=info msg="TearDown network for sandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" successfully" Mar 17 17:31:03.275100 containerd[1538]: time="2025-03-17T17:31:03.275064609Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.275180 containerd[1538]: time="2025-03-17T17:31:03.275129050Z" level=info msg="RemovePodSandbox \"1ff4508db0199a536723ac9cc0d067a5620173549eb0258f05dec8819c7eb91e\" returns successfully" Mar 17 17:31:03.275489 containerd[1538]: time="2025-03-17T17:31:03.275460170Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:31:03.275568 containerd[1538]: time="2025-03-17T17:31:03.275552770Z" level=info msg="TearDown network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" successfully" Mar 17 17:31:03.275596 containerd[1538]: time="2025-03-17T17:31:03.275567330Z" level=info msg="StopPodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" returns successfully" Mar 17 17:31:03.275919 containerd[1538]: time="2025-03-17T17:31:03.275893371Z" level=info msg="RemovePodSandbox for \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:31:03.276907 containerd[1538]: time="2025-03-17T17:31:03.275996171Z" level=info msg="Forcibly stopping sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\"" Mar 17 17:31:03.276907 containerd[1538]: time="2025-03-17T17:31:03.276076131Z" level=info msg="TearDown network for sandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" successfully" Mar 17 17:31:03.278503 containerd[1538]: time="2025-03-17T17:31:03.278462495Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.278555 containerd[1538]: time="2025-03-17T17:31:03.278535975Z" level=info msg="RemovePodSandbox \"02083a2c86a941e92af0f2738328cb6dbd7ef771086b18c28a5b7b3f0b9e0128\" returns successfully" Mar 17 17:31:03.279006 containerd[1538]: time="2025-03-17T17:31:03.278930055Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" Mar 17 17:31:03.279098 containerd[1538]: time="2025-03-17T17:31:03.279076216Z" level=info msg="TearDown network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" successfully" Mar 17 17:31:03.279098 containerd[1538]: time="2025-03-17T17:31:03.279091016Z" level=info msg="StopPodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" returns successfully" Mar 17 17:31:03.279423 containerd[1538]: time="2025-03-17T17:31:03.279400536Z" level=info msg="RemovePodSandbox for \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" Mar 17 17:31:03.279473 containerd[1538]: time="2025-03-17T17:31:03.279428856Z" level=info msg="Forcibly stopping sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\"" Mar 17 17:31:03.279509 containerd[1538]: time="2025-03-17T17:31:03.279494016Z" level=info msg="TearDown network for sandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" successfully" Mar 17 17:31:03.282120 containerd[1538]: time="2025-03-17T17:31:03.282080100Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.282221 containerd[1538]: time="2025-03-17T17:31:03.282146341Z" level=info msg="RemovePodSandbox \"cda3e3de9378d6e78eb85977f9cd1c9c35cf0044788da47b4996cda8fe658a31\" returns successfully" Mar 17 17:31:03.282505 containerd[1538]: time="2025-03-17T17:31:03.282484781Z" level=info msg="StopPodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\"" Mar 17 17:31:03.282611 containerd[1538]: time="2025-03-17T17:31:03.282595741Z" level=info msg="TearDown network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" successfully" Mar 17 17:31:03.282611 containerd[1538]: time="2025-03-17T17:31:03.282609021Z" level=info msg="StopPodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" returns successfully" Mar 17 17:31:03.282888 containerd[1538]: time="2025-03-17T17:31:03.282843462Z" level=info msg="RemovePodSandbox for \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\"" Mar 17 17:31:03.282888 containerd[1538]: time="2025-03-17T17:31:03.282879622Z" level=info msg="Forcibly stopping sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\"" Mar 17 17:31:03.282986 containerd[1538]: time="2025-03-17T17:31:03.282936062Z" level=info msg="TearDown network for sandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" successfully" Mar 17 17:31:03.290861 containerd[1538]: time="2025-03-17T17:31:03.290820394Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.290989 containerd[1538]: time="2025-03-17T17:31:03.290931354Z" level=info msg="RemovePodSandbox \"af9c2eb3974760717f092aa5746d5d0b28c79070c600230214a26e08a9c8f4b7\" returns successfully" Mar 17 17:31:03.291348 containerd[1538]: time="2025-03-17T17:31:03.291320915Z" level=info msg="StopPodSandbox for \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\"" Mar 17 17:31:03.291446 containerd[1538]: time="2025-03-17T17:31:03.291430635Z" level=info msg="TearDown network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\" successfully" Mar 17 17:31:03.291446 containerd[1538]: time="2025-03-17T17:31:03.291444515Z" level=info msg="StopPodSandbox for \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\" returns successfully" Mar 17 17:31:03.292762 containerd[1538]: time="2025-03-17T17:31:03.291718516Z" level=info msg="RemovePodSandbox for \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\"" Mar 17 17:31:03.292762 containerd[1538]: time="2025-03-17T17:31:03.291750276Z" level=info msg="Forcibly stopping sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\"" Mar 17 17:31:03.292762 containerd[1538]: time="2025-03-17T17:31:03.291814396Z" level=info msg="TearDown network for sandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\" successfully" Mar 17 17:31:03.294431 containerd[1538]: time="2025-03-17T17:31:03.294394800Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.294496 containerd[1538]: time="2025-03-17T17:31:03.294456040Z" level=info msg="RemovePodSandbox \"70d438aeb8e1d5490859c31c6276e880bfa2b42bebbf0102325f2730a59a7942\" returns successfully" Mar 17 17:31:03.294939 containerd[1538]: time="2025-03-17T17:31:03.294908481Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:31:03.295031 containerd[1538]: time="2025-03-17T17:31:03.295017161Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:31:03.295072 containerd[1538]: time="2025-03-17T17:31:03.295030601Z" level=info msg="StopPodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns successfully" Mar 17 17:31:03.295352 containerd[1538]: time="2025-03-17T17:31:03.295330241Z" level=info msg="RemovePodSandbox for \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:31:03.295388 containerd[1538]: time="2025-03-17T17:31:03.295360081Z" level=info msg="Forcibly stopping sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\"" Mar 17 17:31:03.295438 containerd[1538]: time="2025-03-17T17:31:03.295424361Z" level=info msg="TearDown network for sandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" successfully" Mar 17 17:31:03.298048 containerd[1538]: time="2025-03-17T17:31:03.298003365Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.298141 containerd[1538]: time="2025-03-17T17:31:03.298081846Z" level=info msg="RemovePodSandbox \"6186e6c97f3101e244d8cc3fafd08de8e3a22840b662c54a68acc32f1029f832\" returns successfully" Mar 17 17:31:03.298501 containerd[1538]: time="2025-03-17T17:31:03.298474486Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:31:03.298755 containerd[1538]: time="2025-03-17T17:31:03.298673887Z" level=info msg="TearDown network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" successfully" Mar 17 17:31:03.298755 containerd[1538]: time="2025-03-17T17:31:03.298691527Z" level=info msg="StopPodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" returns successfully" Mar 17 17:31:03.299083 containerd[1538]: time="2025-03-17T17:31:03.299006327Z" level=info msg="RemovePodSandbox for \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:31:03.299083 containerd[1538]: time="2025-03-17T17:31:03.299040167Z" level=info msg="Forcibly stopping sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\"" Mar 17 17:31:03.299137 containerd[1538]: time="2025-03-17T17:31:03.299116287Z" level=info msg="TearDown network for sandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" successfully" Mar 17 17:31:03.301636 containerd[1538]: time="2025-03-17T17:31:03.301599131Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.301698 containerd[1538]: time="2025-03-17T17:31:03.301669851Z" level=info msg="RemovePodSandbox \"101df8b738de8f174f0ca449d895956cb733ffb9876566b16769cadf03556688\" returns successfully" Mar 17 17:31:03.302068 containerd[1538]: time="2025-03-17T17:31:03.302047972Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" Mar 17 17:31:03.302161 containerd[1538]: time="2025-03-17T17:31:03.302147492Z" level=info msg="TearDown network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" successfully" Mar 17 17:31:03.302161 containerd[1538]: time="2025-03-17T17:31:03.302160812Z" level=info msg="StopPodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" returns successfully" Mar 17 17:31:03.302413 containerd[1538]: time="2025-03-17T17:31:03.302390652Z" level=info msg="RemovePodSandbox for \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" Mar 17 17:31:03.302465 containerd[1538]: time="2025-03-17T17:31:03.302420212Z" level=info msg="Forcibly stopping sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\"" Mar 17 17:31:03.302493 containerd[1538]: time="2025-03-17T17:31:03.302485172Z" level=info msg="TearDown network for sandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" successfully" Mar 17 17:31:03.305146 containerd[1538]: time="2025-03-17T17:31:03.305081217Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.305338 containerd[1538]: time="2025-03-17T17:31:03.305173177Z" level=info msg="RemovePodSandbox \"253485c169e207497fb780a5586f05a3070b309c791cf0aba017eba08935fe7c\" returns successfully" Mar 17 17:31:03.305801 containerd[1538]: time="2025-03-17T17:31:03.305629617Z" level=info msg="StopPodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\"" Mar 17 17:31:03.305801 containerd[1538]: time="2025-03-17T17:31:03.305723778Z" level=info msg="TearDown network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" successfully" Mar 17 17:31:03.305801 containerd[1538]: time="2025-03-17T17:31:03.305733538Z" level=info msg="StopPodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" returns successfully" Mar 17 17:31:03.306051 containerd[1538]: time="2025-03-17T17:31:03.306024738Z" level=info msg="RemovePodSandbox for \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\"" Mar 17 17:31:03.306083 containerd[1538]: time="2025-03-17T17:31:03.306058098Z" level=info msg="Forcibly stopping sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\"" Mar 17 17:31:03.306140 containerd[1538]: time="2025-03-17T17:31:03.306125778Z" level=info msg="TearDown network for sandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" successfully" Mar 17 17:31:03.314078 containerd[1538]: time="2025-03-17T17:31:03.314035071Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.314189 containerd[1538]: time="2025-03-17T17:31:03.314104631Z" level=info msg="RemovePodSandbox \"a0a22c63f656d291c6afbb062d26401d59c89ae753fc33cc84818b5b04033bdb\" returns successfully" Mar 17 17:31:03.314909 containerd[1538]: time="2025-03-17T17:31:03.314773952Z" level=info msg="StopPodSandbox for \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\"" Mar 17 17:31:03.314909 containerd[1538]: time="2025-03-17T17:31:03.314897952Z" level=info msg="TearDown network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\" successfully" Mar 17 17:31:03.314909 containerd[1538]: time="2025-03-17T17:31:03.314911072Z" level=info msg="StopPodSandbox for \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\" returns successfully" Mar 17 17:31:03.315301 containerd[1538]: time="2025-03-17T17:31:03.315274873Z" level=info msg="RemovePodSandbox for \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\"" Mar 17 17:31:03.315354 containerd[1538]: time="2025-03-17T17:31:03.315335913Z" level=info msg="Forcibly stopping sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\"" Mar 17 17:31:03.315450 containerd[1538]: time="2025-03-17T17:31:03.315430473Z" level=info msg="TearDown network for sandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\" successfully" Mar 17 17:31:03.332181 containerd[1538]: time="2025-03-17T17:31:03.332129139Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.332305 containerd[1538]: time="2025-03-17T17:31:03.332268339Z" level=info msg="RemovePodSandbox \"17c85ca66e267b237f74791b6bda05ea742f16161256f73d115a4046e0fa1be7\" returns successfully" Mar 17 17:31:03.332888 containerd[1538]: time="2025-03-17T17:31:03.332833620Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:31:03.333002 containerd[1538]: time="2025-03-17T17:31:03.332990500Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:31:03.333039 containerd[1538]: time="2025-03-17T17:31:03.333003820Z" level=info msg="StopPodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:31:03.333349 containerd[1538]: time="2025-03-17T17:31:03.333305501Z" level=info msg="RemovePodSandbox for \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:31:03.333349 containerd[1538]: time="2025-03-17T17:31:03.333336981Z" level=info msg="Forcibly stopping sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\"" Mar 17 17:31:03.333422 containerd[1538]: time="2025-03-17T17:31:03.333397741Z" level=info msg="TearDown network for sandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" successfully" Mar 17 17:31:03.336491 containerd[1538]: time="2025-03-17T17:31:03.336432586Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.336593 containerd[1538]: time="2025-03-17T17:31:03.336542106Z" level=info msg="RemovePodSandbox \"cc2e31f10c156c4ce6f4b03cac8b72c1b1c48577ccdf1d011796ae0b7de433d1\" returns successfully" Mar 17 17:31:03.337121 containerd[1538]: time="2025-03-17T17:31:03.337049627Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:31:03.337215 containerd[1538]: time="2025-03-17T17:31:03.337182627Z" level=info msg="TearDown network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" successfully" Mar 17 17:31:03.337215 containerd[1538]: time="2025-03-17T17:31:03.337194587Z" level=info msg="StopPodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" returns successfully" Mar 17 17:31:03.337718 containerd[1538]: time="2025-03-17T17:31:03.337684908Z" level=info msg="RemovePodSandbox for \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:31:03.337718 containerd[1538]: time="2025-03-17T17:31:03.337717468Z" level=info msg="Forcibly stopping sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\"" Mar 17 17:31:03.337925 containerd[1538]: time="2025-03-17T17:31:03.337903748Z" level=info msg="TearDown network for sandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" successfully" Mar 17 17:31:03.340780 containerd[1538]: time="2025-03-17T17:31:03.340736633Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.340848 containerd[1538]: time="2025-03-17T17:31:03.340801473Z" level=info msg="RemovePodSandbox \"2016a5ac2f83d734a4dd10539acfc4903a2ed12a5066ef995902dcdab68d4544\" returns successfully" Mar 17 17:31:03.341298 containerd[1538]: time="2025-03-17T17:31:03.341257633Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" Mar 17 17:31:03.341368 containerd[1538]: time="2025-03-17T17:31:03.341346474Z" level=info msg="TearDown network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" successfully" Mar 17 17:31:03.341368 containerd[1538]: time="2025-03-17T17:31:03.341356634Z" level=info msg="StopPodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" returns successfully" Mar 17 17:31:03.341787 containerd[1538]: time="2025-03-17T17:31:03.341761074Z" level=info msg="RemovePodSandbox for \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" Mar 17 17:31:03.341915 containerd[1538]: time="2025-03-17T17:31:03.341792554Z" level=info msg="Forcibly stopping sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\"" Mar 17 17:31:03.341915 containerd[1538]: time="2025-03-17T17:31:03.341851434Z" level=info msg="TearDown network for sandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" successfully" Mar 17 17:31:03.344530 containerd[1538]: time="2025-03-17T17:31:03.344480718Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.344614 containerd[1538]: time="2025-03-17T17:31:03.344552359Z" level=info msg="RemovePodSandbox \"463dce21b0068e0a6151c979ce4ee054d0c9994124ca03b902efce57bcb4def5\" returns successfully" Mar 17 17:31:03.345032 containerd[1538]: time="2025-03-17T17:31:03.344997439Z" level=info msg="StopPodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\"" Mar 17 17:31:03.345155 containerd[1538]: time="2025-03-17T17:31:03.345138079Z" level=info msg="TearDown network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" successfully" Mar 17 17:31:03.345218 containerd[1538]: time="2025-03-17T17:31:03.345155240Z" level=info msg="StopPodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" returns successfully" Mar 17 17:31:03.345537 containerd[1538]: time="2025-03-17T17:31:03.345514280Z" level=info msg="RemovePodSandbox for \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\"" Mar 17 17:31:03.345575 containerd[1538]: time="2025-03-17T17:31:03.345543560Z" level=info msg="Forcibly stopping sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\"" Mar 17 17:31:03.345620 containerd[1538]: time="2025-03-17T17:31:03.345604320Z" level=info msg="TearDown network for sandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" successfully" Mar 17 17:31:03.360839 containerd[1538]: time="2025-03-17T17:31:03.348178764Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 17 17:31:03.360974 containerd[1538]: time="2025-03-17T17:31:03.360899184Z" level=info msg="RemovePodSandbox \"d4f4f647df602477c8c9d69bcabca6725ac015e965372209b06df39e9bf3b5b9\" returns successfully" Mar 17 17:31:03.361487 containerd[1538]: time="2025-03-17T17:31:03.361456345Z" level=info msg="StopPodSandbox for \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\"" Mar 17 17:31:03.361595 containerd[1538]: time="2025-03-17T17:31:03.361575385Z" level=info msg="TearDown network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\" successfully" Mar 17 17:31:03.361595 containerd[1538]: time="2025-03-17T17:31:03.361592345Z" level=info msg="StopPodSandbox for \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\" returns successfully" Mar 17 17:31:03.361953 containerd[1538]: time="2025-03-17T17:31:03.361929346Z" level=info msg="RemovePodSandbox for \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\"" Mar 17 17:31:03.362005 containerd[1538]: time="2025-03-17T17:31:03.361965546Z" level=info msg="Forcibly stopping sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\"" Mar 17 17:31:03.362066 containerd[1538]: time="2025-03-17T17:31:03.362037546Z" level=info msg="TearDown network for sandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\" successfully" Mar 17 17:31:03.364828 containerd[1538]: time="2025-03-17T17:31:03.364792830Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Mar 17 17:31:03.364934 containerd[1538]: time="2025-03-17T17:31:03.364874150Z" level=info msg="RemovePodSandbox \"d1e7413063dd9d241056dfb451e16a35355e0cbddd7e742bfa5d131908062bb1\" returns successfully" Mar 17 17:31:06.740353 systemd[1]: Started sshd@19-10.0.0.79:22-10.0.0.1:56026.service - OpenSSH per-connection server daemon (10.0.0.1:56026). Mar 17 17:31:06.780457 sshd[6097]: Accepted publickey for core from 10.0.0.1 port 56026 ssh2: RSA SHA256:XEsN/dc1y+7MY2pZiPvPM9E3FANLWuBR2AC7g0KqjmQ Mar 17 17:31:06.781982 sshd-session[6097]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:31:06.785650 systemd-logind[1523]: New session 20 of user core. Mar 17 17:31:06.795269 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 17 17:31:06.926907 sshd[6100]: Connection closed by 10.0.0.1 port 56026 Mar 17 17:31:06.927182 sshd-session[6097]: pam_unix(sshd:session): session closed for user core Mar 17 17:31:06.930574 systemd[1]: sshd@19-10.0.0.79:22-10.0.0.1:56026.service: Deactivated successfully. Mar 17 17:31:06.933384 systemd[1]: session-20.scope: Deactivated successfully. Mar 17 17:31:06.933697 systemd-logind[1523]: Session 20 logged out. Waiting for processes to exit. Mar 17 17:31:06.935464 systemd-logind[1523]: Removed session 20.
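
Most of the preceding entries are kubelet asking containerd, over the CRI RuntimeService API, to stop, tear down, and remove pod sandboxes left over from earlier pods, forcing removal even when the sandbox status can no longer be found. As a hedged illustration only (none of the identifiers below come from the log), the same ListPodSandbox/RemovePodSandbox calls can be issued directly with the k8s.io/cri-api Go bindings; the CRI socket path and the choice to target only not-ready sandboxes are assumptions.

package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Assumed CRI endpoint; adjust if containerd is configured differently on the node.
	const endpoint = "unix:///run/containerd/containerd.sock"

	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, endpoint, grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial %s: %v", endpoint, err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)

	// List every sandbox the runtime still knows about, then remove the ones that are no longer ready,
	// mirroring (for illustration) the RemovePodSandbox sequence kubelet drives in the log above.
	resp, err := rt.ListPodSandbox(ctx, &runtimeapi.ListPodSandboxRequest{})
	if err != nil {
		log.Fatalf("ListPodSandbox: %v", err)
	}
	for _, sb := range resp.Items {
		if sb.State != runtimeapi.PodSandboxState_SANDBOX_NOTREADY {
			continue
		}
		if _, err := rt.RemovePodSandbox(ctx, &runtimeapi.RemovePodSandboxRequest{PodSandboxId: sb.Id}); err != nil {
			log.Printf("remove %s: %v", sb.Id, err)
			continue
		}
		fmt.Printf("removed sandbox %s (%s)\n", sb.Id, sb.Metadata.GetName())
	}
}

The crictl pods and crictl rmp commands expose the same two CRI calls from the command line.
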