Mar 17 17:48:34.975538 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 17 17:48:34.975560 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT Mon Mar 17 16:11:40 -00 2025
Mar 17 17:48:34.975569 kernel: KASLR enabled
Mar 17 17:48:34.975575 kernel: efi: EFI v2.7 by EDK II
Mar 17 17:48:34.975580 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdbbae018 ACPI 2.0=0xd9b43018 RNG=0xd9b43a18 MEMRESERVE=0xd9b40218
Mar 17 17:48:34.975586 kernel: random: crng init done
Mar 17 17:48:34.975592 kernel: secureboot: Secure boot disabled
Mar 17 17:48:34.975598 kernel: ACPI: Early table checksum verification disabled
Mar 17 17:48:34.975604 kernel: ACPI: RSDP 0x00000000D9B43018 000024 (v02 BOCHS )
Mar 17 17:48:34.975611 kernel: ACPI: XSDT 0x00000000D9B43F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Mar 17 17:48:34.975617 kernel: ACPI: FACP 0x00000000D9B43B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975622 kernel: ACPI: DSDT 0x00000000D9B41018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975628 kernel: ACPI: APIC 0x00000000D9B43C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975634 kernel: ACPI: PPTT 0x00000000D9B43098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975641 kernel: ACPI: GTDT 0x00000000D9B43818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975648 kernel: ACPI: MCFG 0x00000000D9B43A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975654 kernel: ACPI: SPCR 0x00000000D9B43918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975660 kernel: ACPI: DBG2 0x00000000D9B43998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975666 kernel: ACPI: IORT 0x00000000D9B43198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 17 17:48:34.975672 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Mar 17 17:48:34.975678 kernel: NUMA: Failed to initialise from firmware
Mar 17 17:48:34.975684 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Mar 17 17:48:34.975690 kernel: NUMA: NODE_DATA [mem 0xdc958800-0xdc95dfff]
Mar 17 17:48:34.975696 kernel: Zone ranges:
Mar 17 17:48:34.975702 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Mar 17 17:48:34.975709 kernel: DMA32 empty
Mar 17 17:48:34.975715 kernel: Normal empty
Mar 17 17:48:34.975721 kernel: Movable zone start for each node
Mar 17 17:48:34.975727 kernel: Early memory node ranges
Mar 17 17:48:34.975733 kernel: node 0: [mem 0x0000000040000000-0x00000000d967ffff]
Mar 17 17:48:34.975739 kernel: node 0: [mem 0x00000000d9680000-0x00000000d968ffff]
Mar 17 17:48:34.975745 kernel: node 0: [mem 0x00000000d9690000-0x00000000d976ffff]
Mar 17 17:48:34.975751 kernel: node 0: [mem 0x00000000d9770000-0x00000000d9b3ffff]
Mar 17 17:48:34.975758 kernel: node 0: [mem 0x00000000d9b40000-0x00000000dce1ffff]
Mar 17 17:48:34.975764 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff]
Mar 17 17:48:34.975769 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff]
Mar 17 17:48:34.975776 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff]
Mar 17 17:48:34.975784 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Mar 17 17:48:34.975790 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Mar 17 17:48:34.975801 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Mar 17 17:48:34.975810 kernel: psci: probing for conduit method from ACPI.
Mar 17 17:48:34.975820 kernel: psci: PSCIv1.1 detected in firmware.
Mar 17 17:48:34.975826 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 17 17:48:34.975834 kernel: psci: Trusted OS migration not required
Mar 17 17:48:34.975841 kernel: psci: SMC Calling Convention v1.1
Mar 17 17:48:34.975847 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 17 17:48:34.976021 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Mar 17 17:48:34.976069 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Mar 17 17:48:34.976113 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Mar 17 17:48:34.976120 kernel: Detected PIPT I-cache on CPU0
Mar 17 17:48:34.976126 kernel: CPU features: detected: GIC system register CPU interface
Mar 17 17:48:34.976133 kernel: CPU features: detected: Hardware dirty bit management
Mar 17 17:48:34.976139 kernel: CPU features: detected: Spectre-v4
Mar 17 17:48:34.976150 kernel: CPU features: detected: Spectre-BHB
Mar 17 17:48:34.976157 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 17 17:48:34.976163 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 17 17:48:34.976170 kernel: CPU features: detected: ARM erratum 1418040
Mar 17 17:48:34.976176 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 17 17:48:34.976183 kernel: alternatives: applying boot alternatives
Mar 17 17:48:34.976190 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=f8298a09e890fc732131b7281e24befaf65b596eb5216e969c8eca4cab4a2b3a
Mar 17 17:48:34.976198 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 17 17:48:34.976204 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 17 17:48:34.976211 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 17 17:48:34.976217 kernel: Fallback order for Node 0: 0
Mar 17 17:48:34.976225 kernel: Built 1 zonelists, mobility grouping on. Total pages: 633024
Mar 17 17:48:34.976231 kernel: Policy zone: DMA
Mar 17 17:48:34.976237 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 17 17:48:34.976244 kernel: software IO TLB: area num 4.
Mar 17 17:48:34.976250 kernel: software IO TLB: mapped [mem 0x00000000d2e00000-0x00000000d6e00000] (64MB)
Mar 17 17:48:34.976257 kernel: Memory: 2387540K/2572288K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38336K init, 897K bss, 184748K reserved, 0K cma-reserved)
Mar 17 17:48:34.976263 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 17 17:48:34.976270 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 17 17:48:34.976277 kernel: rcu: RCU event tracing is enabled.
Mar 17 17:48:34.976283 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 17 17:48:34.976290 kernel: Trampoline variant of Tasks RCU enabled.
Mar 17 17:48:34.976296 kernel: Tracing variant of Tasks RCU enabled.
Mar 17 17:48:34.976304 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 17 17:48:34.976311 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 17 17:48:34.976317 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 17 17:48:34.976323 kernel: GICv3: 256 SPIs implemented
Mar 17 17:48:34.976330 kernel: GICv3: 0 Extended SPIs implemented
Mar 17 17:48:34.976336 kernel: Root IRQ handler: gic_handle_irq
Mar 17 17:48:34.976342 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 17 17:48:34.976349 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 17 17:48:34.976355 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 17 17:48:34.976362 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400c0000 (indirect, esz 8, psz 64K, shr 1)
Mar 17 17:48:34.976368 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400d0000 (flat, esz 8, psz 64K, shr 1)
Mar 17 17:48:34.976376 kernel: GICv3: using LPI property table @0x00000000400f0000
Mar 17 17:48:34.976382 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040100000
Mar 17 17:48:34.976389 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 17 17:48:34.976395 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 17 17:48:34.976401 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 17 17:48:34.976408 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 17 17:48:34.976414 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 17 17:48:34.976421 kernel: arm-pv: using stolen time PV
Mar 17 17:48:34.976428 kernel: Console: colour dummy device 80x25
Mar 17 17:48:34.976434 kernel: ACPI: Core revision 20230628
Mar 17 17:48:34.976441 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 17 17:48:34.976449 kernel: pid_max: default: 32768 minimum: 301
Mar 17 17:48:34.976456 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 17 17:48:34.976462 kernel: landlock: Up and running.
Mar 17 17:48:34.976469 kernel: SELinux: Initializing.
Mar 17 17:48:34.976476 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 17 17:48:34.976482 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 17 17:48:34.976489 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 17 17:48:34.976496 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 17 17:48:34.976502 kernel: rcu: Hierarchical SRCU implementation.
Mar 17 17:48:34.976510 kernel: rcu: Max phase no-delay instances is 400.
Mar 17 17:48:34.976517 kernel: Platform MSI: ITS@0x8080000 domain created
Mar 17 17:48:34.976524 kernel: PCI/MSI: ITS@0x8080000 domain created
Mar 17 17:48:34.976530 kernel: Remapping and enabling EFI services.
Mar 17 17:48:34.976537 kernel: smp: Bringing up secondary CPUs ...
Mar 17 17:48:34.976543 kernel: Detected PIPT I-cache on CPU1
Mar 17 17:48:34.976550 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 17 17:48:34.976557 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040110000
Mar 17 17:48:34.976564 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 17 17:48:34.976572 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 17 17:48:34.976578 kernel: Detected PIPT I-cache on CPU2
Mar 17 17:48:34.976589 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Mar 17 17:48:34.976598 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040120000
Mar 17 17:48:34.976605 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 17 17:48:34.976612 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Mar 17 17:48:34.976619 kernel: Detected PIPT I-cache on CPU3
Mar 17 17:48:34.976626 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Mar 17 17:48:34.976633 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040130000
Mar 17 17:48:34.976642 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 17 17:48:34.976648 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Mar 17 17:48:34.976655 kernel: smp: Brought up 1 node, 4 CPUs
Mar 17 17:48:34.976662 kernel: SMP: Total of 4 processors activated.
Mar 17 17:48:34.976669 kernel: CPU features: detected: 32-bit EL0 Support
Mar 17 17:48:34.976676 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 17 17:48:34.976683 kernel: CPU features: detected: Common not Private translations
Mar 17 17:48:34.976690 kernel: CPU features: detected: CRC32 instructions
Mar 17 17:48:34.976698 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 17 17:48:34.976705 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 17 17:48:34.976712 kernel: CPU features: detected: LSE atomic instructions
Mar 17 17:48:34.976719 kernel: CPU features: detected: Privileged Access Never
Mar 17 17:48:34.976725 kernel: CPU features: detected: RAS Extension Support
Mar 17 17:48:34.976732 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 17 17:48:34.976739 kernel: CPU: All CPU(s) started at EL1
Mar 17 17:48:34.976746 kernel: alternatives: applying system-wide alternatives
Mar 17 17:48:34.976753 kernel: devtmpfs: initialized
Mar 17 17:48:34.976760 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 17 17:48:34.976768 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 17 17:48:34.976775 kernel: pinctrl core: initialized pinctrl subsystem
Mar 17 17:48:34.976782 kernel: SMBIOS 3.0.0 present.
Mar 17 17:48:34.976790 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Mar 17 17:48:34.976797 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 17 17:48:34.976804 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 17 17:48:34.976811 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 17 17:48:34.976831 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 17 17:48:34.976839 kernel: audit: initializing netlink subsys (disabled)
Mar 17 17:48:34.976848 kernel: audit: type=2000 audit(0.020:1): state=initialized audit_enabled=0 res=1
Mar 17 17:48:34.976855 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 17 17:48:34.976862 kernel: cpuidle: using governor menu
Mar 17 17:48:34.976869 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 17 17:48:34.976876 kernel: ASID allocator initialised with 32768 entries
Mar 17 17:48:34.976883 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 17 17:48:34.976890 kernel: Serial: AMBA PL011 UART driver
Mar 17 17:48:34.976897 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 17 17:48:34.976917 kernel: Modules: 0 pages in range for non-PLT usage
Mar 17 17:48:34.976928 kernel: Modules: 509280 pages in range for PLT usage
Mar 17 17:48:34.976935 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 17 17:48:34.976942 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 17 17:48:34.976949 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 17 17:48:34.976956 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 17 17:48:34.976963 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 17 17:48:34.976971 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 17 17:48:34.976978 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 17 17:48:34.976985 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 17 17:48:34.976993 kernel: ACPI: Added _OSI(Module Device)
Mar 17 17:48:34.977000 kernel: ACPI: Added _OSI(Processor Device)
Mar 17 17:48:34.977007 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 17 17:48:34.977014 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 17 17:48:34.977020 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 17 17:48:34.977027 kernel: ACPI: Interpreter enabled
Mar 17 17:48:34.977034 kernel: ACPI: Using GIC for interrupt routing
Mar 17 17:48:34.977041 kernel: ACPI: MCFG table detected, 1 entries
Mar 17 17:48:34.977048 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 17 17:48:34.977056 kernel: printk: console [ttyAMA0] enabled
Mar 17 17:48:34.977063 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 17 17:48:34.977227 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 17 17:48:34.977303 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 17 17:48:34.977367 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 17 17:48:34.977430 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 17 17:48:34.977492 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 17 17:48:34.977504 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 17 17:48:34.977511 kernel: PCI host bridge to bus 0000:00
Mar 17 17:48:34.977582 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 17 17:48:34.977649 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 17 17:48:34.977705 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 17 17:48:34.977765 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 17 17:48:34.977851 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Mar 17 17:48:34.977950 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00
Mar 17 17:48:34.978118 kernel: pci 0000:00:01.0: reg 0x10: [io 0x0000-0x001f]
Mar 17 17:48:34.978200 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x10000000-0x10000fff]
Mar 17 17:48:34.978267 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 17 17:48:34.978332 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 17 17:48:34.978396 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x10000000-0x10000fff]
Mar 17 17:48:34.978460 kernel: pci 0000:00:01.0: BAR 0: assigned [io 0x1000-0x101f]
Mar 17 17:48:34.978527 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Mar 17 17:48:34.978584 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 17 17:48:34.978640 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Mar 17 17:48:34.978649 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Mar 17 17:48:34.978657 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Mar 17 17:48:34.978663 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Mar 17 17:48:34.978670 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Mar 17 17:48:34.978679 kernel: iommu: Default domain type: Translated
Mar 17 17:48:34.978686 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 17 17:48:34.978693 kernel: efivars: Registered efivars operations
Mar 17 17:48:34.978700 kernel: vgaarb: loaded
Mar 17 17:48:34.978707 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 17 17:48:34.978714 kernel: VFS: Disk quotas dquot_6.6.0
Mar 17 17:48:34.978721 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 17 17:48:34.978728 kernel: pnp: PnP ACPI init
Mar 17 17:48:34.978805 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Mar 17 17:48:34.978817 kernel: pnp: PnP ACPI: found 1 devices
Mar 17 17:48:34.978824 kernel: NET: Registered PF_INET protocol family
Mar 17 17:48:34.978831 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 17 17:48:34.978838 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 17 17:48:34.978846 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 17 17:48:34.978853 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 17 17:48:34.978860 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 17 17:48:34.978867 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 17 17:48:34.978874 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:48:34.978883 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 17 17:48:34.978890 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 17 17:48:34.978897 kernel: PCI: CLS 0 bytes, default 64
Mar 17 17:48:34.978929 kernel: kvm [1]: HYP mode not available
Mar 17 17:48:34.978939 kernel: Initialise system trusted keyrings
Mar 17 17:48:34.978946 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 17 17:48:34.978955 kernel: Key type asymmetric registered
Mar 17 17:48:34.979042 kernel: Asymmetric key parser 'x509' registered
Mar 17 17:48:34.979086 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 17 17:48:34.979101 kernel: io scheduler mq-deadline registered
Mar 17 17:48:34.979108 kernel: io scheduler kyber registered
Mar 17 17:48:34.979115 kernel: io scheduler bfq registered
Mar 17 17:48:34.979122 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Mar 17 17:48:34.979129 kernel: ACPI: button: Power Button [PWRB]
Mar 17 17:48:34.979137 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Mar 17 17:48:34.979331 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Mar 17 17:48:34.979345 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 17 17:48:34.979352 kernel: thunder_xcv, ver 1.0
Mar 17 17:48:34.979364 kernel: thunder_bgx, ver 1.0
Mar 17 17:48:34.979371 kernel: nicpf, ver 1.0
Mar 17 17:48:34.979378 kernel: nicvf, ver 1.0
Mar 17 17:48:34.979455 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 17 17:48:34.979521 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T17:48:34 UTC (1742233714)
Mar 17 17:48:34.979530 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 17 17:48:34.979611 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Mar 17 17:48:34.979622 kernel: watchdog: Delayed init of the lockup detector failed: -19
Mar 17 17:48:34.979634 kernel: watchdog: Hard watchdog permanently disabled
Mar 17 17:48:34.979641 kernel: NET: Registered PF_INET6 protocol family
Mar 17 17:48:34.979648 kernel: Segment Routing with IPv6
Mar 17 17:48:34.979654 kernel: In-situ OAM (IOAM) with IPv6
Mar 17 17:48:34.979661 kernel: NET: Registered PF_PACKET protocol family
Mar 17 17:48:34.979668 kernel: Key type dns_resolver registered
Mar 17 17:48:34.979675 kernel: registered taskstats version 1
Mar 17 17:48:34.979682 kernel: Loading compiled-in X.509 certificates
Mar 17 17:48:34.979689 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: f4ff2820cf7379ce82b759137d15b536f0a99b51'
Mar 17 17:48:34.979697 kernel: Key type .fscrypt registered
Mar 17 17:48:34.979704 kernel: Key type fscrypt-provisioning registered
Mar 17 17:48:34.979711 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 17 17:48:34.979718 kernel: ima: Allocated hash algorithm: sha1
Mar 17 17:48:34.979725 kernel: ima: No architecture policies found
Mar 17 17:48:34.979732 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 17 17:48:34.979739 kernel: clk: Disabling unused clocks
Mar 17 17:48:34.979746 kernel: Freeing unused kernel memory: 38336K
Mar 17 17:48:34.979753 kernel: Run /init as init process
Mar 17 17:48:34.979761 kernel: with arguments:
Mar 17 17:48:34.979768 kernel: /init
Mar 17 17:48:34.979775 kernel: with environment:
Mar 17 17:48:34.979781 kernel: HOME=/
Mar 17 17:48:34.979788 kernel: TERM=linux
Mar 17 17:48:34.979795 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 17 17:48:34.979803 systemd[1]: Successfully made /usr/ read-only.
Mar 17 17:48:34.979813 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 17 17:48:34.979822 systemd[1]: Detected virtualization kvm.
Mar 17 17:48:34.979830 systemd[1]: Detected architecture arm64.
Mar 17 17:48:34.979837 systemd[1]: Running in initrd.
Mar 17 17:48:34.979844 systemd[1]: No hostname configured, using default hostname.
Mar 17 17:48:34.979851 systemd[1]: Hostname set to .
Mar 17 17:48:34.979859 systemd[1]: Initializing machine ID from VM UUID.
Mar 17 17:48:34.979866 systemd[1]: Queued start job for default target initrd.target.
Mar 17 17:48:34.979874 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:48:34.979883 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:48:34.979891 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 17 17:48:34.979899 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:48:34.979931 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 17 17:48:34.979940 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 17 17:48:34.979949 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 17 17:48:34.979959 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 17 17:48:34.979967 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:48:34.979974 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:48:34.979982 systemd[1]: Reached target paths.target - Path Units.
Mar 17 17:48:34.979989 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:48:34.979997 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:48:34.980004 systemd[1]: Reached target timers.target - Timer Units.
Mar 17 17:48:34.980012 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:48:34.980019 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:48:34.980028 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 17 17:48:34.980036 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 17 17:48:34.980043 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:48:34.980051 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:48:34.980058 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:48:34.980066 systemd[1]: Reached target sockets.target - Socket Units.
Mar 17 17:48:34.980073 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 17 17:48:34.980081 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:48:34.980090 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 17 17:48:34.980097 systemd[1]: Starting systemd-fsck-usr.service...
Mar 17 17:48:34.980104 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:48:34.980112 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:48:34.980119 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:48:34.980127 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 17 17:48:34.980134 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:48:34.980144 systemd[1]: Finished systemd-fsck-usr.service.
Mar 17 17:48:34.980213 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 17 17:48:34.980224 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:48:34.980231 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:48:34.980239 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 17 17:48:34.980275 systemd-journald[238]: Collecting audit messages is disabled.
Mar 17 17:48:34.980298 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 17 17:48:34.980305 kernel: Bridge firewalling registered
Mar 17 17:48:34.980313 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:48:34.980322 systemd-journald[238]: Journal started
Mar 17 17:48:34.980341 systemd-journald[238]: Runtime Journal (/run/log/journal/59e6f4615cff4b2d88677c1d3dfd751d) is 5.9M, max 47.3M, 41.4M free.
Mar 17 17:48:34.956914 systemd-modules-load[239]: Inserted module 'overlay'
Mar 17 17:48:34.977785 systemd-modules-load[239]: Inserted module 'br_netfilter'
Mar 17 17:48:34.986103 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:48:34.986124 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:48:34.989277 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:48:34.995250 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:48:35.009088 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 17 17:48:35.010717 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:48:35.012454 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:48:35.020080 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:48:35.023110 dracut-cmdline[268]: dracut-dracut-053
Mar 17 17:48:35.023271 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:48:35.026323 dracut-cmdline[268]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=f8298a09e890fc732131b7281e24befaf65b596eb5216e969c8eca4cab4a2b3a
Mar 17 17:48:35.035069 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:48:35.065526 systemd-resolved[290]: Positive Trust Anchors:
Mar 17 17:48:35.065541 systemd-resolved[290]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:48:35.065572 systemd-resolved[290]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:48:35.070081 systemd-resolved[290]: Defaulting to hostname 'linux'.
Mar 17 17:48:35.071026 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:48:35.075169 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:48:35.098943 kernel: SCSI subsystem initialized
Mar 17 17:48:35.103936 kernel: Loading iSCSI transport class v2.0-870.
Mar 17 17:48:35.111945 kernel: iscsi: registered transport (tcp)
Mar 17 17:48:35.124384 kernel: iscsi: registered transport (qla4xxx)
Mar 17 17:48:35.124429 kernel: QLogic iSCSI HBA Driver
Mar 17 17:48:35.169101 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:48:35.179076 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 17 17:48:35.194954 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 17 17:48:35.195006 kernel: device-mapper: uevent: version 1.0.3
Mar 17 17:48:35.196944 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 17 17:48:35.246000 kernel: raid6: neonx8 gen() 15751 MB/s
Mar 17 17:48:35.262961 kernel: raid6: neonx4 gen() 15798 MB/s
Mar 17 17:48:35.279959 kernel: raid6: neonx2 gen() 13271 MB/s
Mar 17 17:48:35.296958 kernel: raid6: neonx1 gen() 10502 MB/s
Mar 17 17:48:35.313959 kernel: raid6: int64x8 gen() 6770 MB/s
Mar 17 17:48:35.330978 kernel: raid6: int64x4 gen() 7333 MB/s
Mar 17 17:48:35.347960 kernel: raid6: int64x2 gen() 6093 MB/s
Mar 17 17:48:35.365167 kernel: raid6: int64x1 gen() 5056 MB/s
Mar 17 17:48:35.365222 kernel: raid6: using algorithm neonx4 gen() 15798 MB/s
Mar 17 17:48:35.383132 kernel: raid6: .... xor() 12426 MB/s, rmw enabled
Mar 17 17:48:35.383191 kernel: raid6: using neon recovery algorithm
Mar 17 17:48:35.387949 kernel: xor: measuring software checksum speed
Mar 17 17:48:35.389233 kernel: 8regs : 18094 MB/sec
Mar 17 17:48:35.389262 kernel: 32regs : 21670 MB/sec
Mar 17 17:48:35.390507 kernel: arm64_neon : 27794 MB/sec
Mar 17 17:48:35.390530 kernel: xor: using function: arm64_neon (27794 MB/sec)
Mar 17 17:48:35.443953 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 17 17:48:35.455804 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:48:35.467071 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:48:35.485475 systemd-udevd[462]: Using default interface naming scheme 'v255'.
Mar 17 17:48:35.490219 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:48:35.497088 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 17 17:48:35.508424 dracut-pre-trigger[469]: rd.md=0: removing MD RAID activation
Mar 17 17:48:35.537087 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:48:35.548089 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:48:35.588070 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:48:35.596084 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 17 17:48:35.613078 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:48:35.614609 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:48:35.618666 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:48:35.620222 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:48:35.628060 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 17 17:48:35.638148 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:48:35.654498 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Mar 17 17:48:35.662111 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Mar 17 17:48:35.662213 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 17 17:48:35.662225 kernel: GPT:9289727 != 19775487
Mar 17 17:48:35.662233 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 17 17:48:35.662242 kernel: GPT:9289727 != 19775487
Mar 17 17:48:35.662252 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 17 17:48:35.662261 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 17 17:48:35.661038 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:48:35.661151 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:48:35.665079 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:48:35.666637 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:48:35.666777 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:48:35.670407 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:48:35.681584 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:48:35.686081 kernel: BTRFS: device fsid 5ecee764-de70-4de1-8711-3798360e0d13 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (512)
Mar 17 17:48:35.690939 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (506)
Mar 17 17:48:35.696220 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:48:35.705965 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 17 17:48:35.720994 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 17 17:48:35.737513 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 17 17:48:35.738821 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 17 17:48:35.749286 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 17 17:48:35.763071 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 17 17:48:35.764989 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 17 17:48:35.771787 disk-uuid[549]: Primary Header is updated.
Mar 17 17:48:35.771787 disk-uuid[549]: Secondary Entries is updated.
Mar 17 17:48:35.771787 disk-uuid[549]: Secondary Header is updated.
Mar 17 17:48:35.780953 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 17 17:48:35.784826 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:48:35.788759 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 17 17:48:36.789947 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 17 17:48:36.790231 disk-uuid[550]: The operation has completed successfully.
Mar 17 17:48:36.824120 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 17 17:48:36.824217 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 17 17:48:36.859085 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 17 17:48:36.861961 sh[572]: Success
Mar 17 17:48:36.876939 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 17 17:48:36.906388 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 17 17:48:36.918372 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 17 17:48:36.920702 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 17 17:48:36.930516 kernel: BTRFS info (device dm-0): first mount of filesystem 5ecee764-de70-4de1-8711-3798360e0d13
Mar 17 17:48:36.930552 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:48:36.930563 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 17 17:48:36.932474 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 17 17:48:36.932504 kernel: BTRFS info (device dm-0): using free space tree
Mar 17 17:48:36.936318 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 17 17:48:36.937662 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 17 17:48:36.942050 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 17 17:48:36.943583 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 17 17:48:36.955991 kernel: BTRFS info (device vda6): first mount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45
Mar 17 17:48:36.956038 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:48:36.956944 kernel: BTRFS info (device vda6): using free space tree
Mar 17 17:48:36.959187 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 17 17:48:36.967116 systemd[1]: mnt-oem.mount: Deactivated successfully.
Mar 17 17:48:36.968799 kernel: BTRFS info (device vda6): last unmount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45
Mar 17 17:48:36.978957 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 17 17:48:36.985080 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 17 17:48:37.044012 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:48:37.056299 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:48:37.092938 systemd-networkd[765]: lo: Link UP
Mar 17 17:48:37.092948 systemd-networkd[765]: lo: Gained carrier
Mar 17 17:48:37.093730 systemd-networkd[765]: Enumeration completed
Mar 17 17:48:37.093992 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:48:37.096021 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:48:37.096025 systemd-networkd[765]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 17 17:48:37.101900 ignition[680]: Ignition 2.20.0
Mar 17 17:48:37.096668 systemd-networkd[765]: eth0: Link UP
Mar 17 17:48:37.101927 ignition[680]: Stage: fetch-offline
Mar 17 17:48:37.096671 systemd-networkd[765]: eth0: Gained carrier
Mar 17 17:48:37.101964 ignition[680]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:48:37.096677 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:48:37.101972 ignition[680]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:48:37.097657 systemd[1]: Reached target network.target - Network.
Mar 17 17:48:37.102123 ignition[680]: parsed url from cmdline: ""
Mar 17 17:48:37.102126 ignition[680]: no config URL provided
Mar 17 17:48:37.102131 ignition[680]: reading system config file "/usr/lib/ignition/user.ign"
Mar 17 17:48:37.102139 ignition[680]: no config at "/usr/lib/ignition/user.ign"
Mar 17 17:48:37.114605 systemd-networkd[765]: eth0: DHCPv4 address 10.0.0.115/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 17 17:48:37.102161 ignition[680]: op(1): [started] loading QEMU firmware config module
Mar 17 17:48:37.102165 ignition[680]: op(1): executing: "modprobe" "qemu_fw_cfg"
Mar 17 17:48:37.108769 ignition[680]: op(1): [finished] loading QEMU firmware config module
Mar 17 17:48:37.108790 ignition[680]: QEMU firmware config was not found. Ignoring...
Mar 17 17:48:37.154310 ignition[680]: parsing config with SHA512: 94b7596f58a7613e9cf5973afe84286b3937a2562c57a901fecb71573d05f73dfa871982068cfe55488cd91256c490017f98f8d336e249fd94b7383b4fdff366
Mar 17 17:48:37.158722 unknown[680]: fetched base config from "system"
Mar 17 17:48:37.158732 unknown[680]: fetched user config from "qemu"
Mar 17 17:48:37.159154 ignition[680]: fetch-offline: fetch-offline passed
Mar 17 17:48:37.161593 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:48:37.159244 ignition[680]: Ignition finished successfully
Mar 17 17:48:37.163261 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Mar 17 17:48:37.177108 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 17 17:48:37.189572 ignition[771]: Ignition 2.20.0
Mar 17 17:48:37.189582 ignition[771]: Stage: kargs
Mar 17 17:48:37.189734 ignition[771]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:48:37.189744 ignition[771]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:48:37.191245 ignition[771]: kargs: kargs passed
Mar 17 17:48:37.194268 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 17 17:48:37.191343 ignition[771]: Ignition finished successfully
Mar 17 17:48:37.206089 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 17 17:48:37.216128 ignition[779]: Ignition 2.20.0
Mar 17 17:48:37.216138 ignition[779]: Stage: disks
Mar 17 17:48:37.216287 ignition[779]: no configs at "/usr/lib/ignition/base.d"
Mar 17 17:48:37.218704 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 17 17:48:37.216296 ignition[779]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:48:37.220325 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 17 17:48:37.217107 ignition[779]: disks: disks passed
Mar 17 17:48:37.222079 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 17 17:48:37.217148 ignition[779]: Ignition finished successfully
Mar 17 17:48:37.224147 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:48:37.226011 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 17 17:48:37.227475 systemd[1]: Reached target basic.target - Basic System.
Mar 17 17:48:37.236104 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 17 17:48:37.244610 systemd-resolved[290]: Detected conflict on linux IN A 10.0.0.115
Mar 17 17:48:37.244625 systemd-resolved[290]: Hostname conflict, changing published hostname from 'linux' to 'linux9'.
Mar 17 17:48:37.247698 systemd-fsck[792]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Mar 17 17:48:37.252403 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 17 17:48:37.265061 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 17 17:48:37.307935 kernel: EXT4-fs (vda9): mounted filesystem 3914ef65-c5cd-468c-8ee7-964383d8e9e2 r/w with ordered data mode. Quota mode: none.
Mar 17 17:48:37.308118 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 17 17:48:37.309397 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:48:37.327031 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:48:37.328712 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 17 17:48:37.329898 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 17 17:48:37.329960 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 17 17:48:37.330012 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:48:37.336200 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 17 17:48:37.340498 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (800)
Mar 17 17:48:37.340525 kernel: BTRFS info (device vda6): first mount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45
Mar 17 17:48:37.340536 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:48:37.339388 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 17 17:48:37.344987 kernel: BTRFS info (device vda6): using free space tree
Mar 17 17:48:37.345011 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 17 17:48:37.346423 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:48:37.378795 initrd-setup-root[826]: cut: /sysroot/etc/passwd: No such file or directory
Mar 17 17:48:37.381901 initrd-setup-root[833]: cut: /sysroot/etc/group: No such file or directory
Mar 17 17:48:37.385944 initrd-setup-root[840]: cut: /sysroot/etc/shadow: No such file or directory
Mar 17 17:48:37.389480 initrd-setup-root[847]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 17 17:48:37.457888 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 17 17:48:37.470016 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 17 17:48:37.472537 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 17 17:48:37.477927 kernel: BTRFS info (device vda6): last unmount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45
Mar 17 17:48:37.494922 ignition[915]: INFO : Ignition 2.20.0
Mar 17 17:48:37.494922 ignition[915]: INFO : Stage: mount
Mar 17 17:48:37.494922 ignition[915]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:48:37.494922 ignition[915]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:48:37.494922 ignition[915]: INFO : mount: mount passed
Mar 17 17:48:37.494922 ignition[915]: INFO : Ignition finished successfully
Mar 17 17:48:37.495623 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 17 17:48:37.496851 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 17 17:48:37.505047 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 17 17:48:37.960365 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 17 17:48:37.972138 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 17 17:48:37.979004 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (930)
Mar 17 17:48:37.979039 kernel: BTRFS info (device vda6): first mount of filesystem 8369c249-c0a6-415d-8511-1f18dbf3bf45
Mar 17 17:48:37.979050 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 17 17:48:37.979938 kernel: BTRFS info (device vda6): using free space tree
Mar 17 17:48:37.982924 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 17 17:48:37.983832 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 17 17:48:37.999113 ignition[947]: INFO : Ignition 2.20.0
Mar 17 17:48:37.999113 ignition[947]: INFO : Stage: files
Mar 17 17:48:38.000789 ignition[947]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:48:38.000789 ignition[947]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:48:38.000789 ignition[947]: DEBUG : files: compiled without relabeling support, skipping
Mar 17 17:48:38.004295 ignition[947]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 17 17:48:38.004295 ignition[947]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 17 17:48:38.007080 ignition[947]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 17 17:48:38.007080 ignition[947]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 17 17:48:38.007080 ignition[947]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 17 17:48:38.006597 unknown[947]: wrote ssh authorized keys file for user: core
Mar 17 17:48:38.012288 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:48:38.012288 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Mar 17 17:48:38.063294 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 17 17:48:38.265919 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 17 17:48:38.265919 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:48:38.269769 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1
Mar 17 17:48:38.398050 systemd-networkd[765]: eth0: Gained IPv6LL
Mar 17 17:48:38.605829 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 17 17:48:38.868385 ignition[947]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw"
Mar 17 17:48:38.868385 ignition[947]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 17 17:48:38.871817 ignition[947]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:48:38.887233 ignition[947]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:48:38.890896 ignition[947]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 17 17:48:38.893689 ignition[947]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 17 17:48:38.893689 ignition[947]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Mar 17 17:48:38.893689 ignition[947]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Mar 17 17:48:38.893689 ignition[947]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:48:38.893689 ignition[947]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 17 17:48:38.893689 ignition[947]: INFO : files: files passed
Mar 17 17:48:38.893689 ignition[947]: INFO : Ignition finished successfully
Mar 17 17:48:38.895436 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 17 17:48:38.906105 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 17 17:48:38.907788 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 17 17:48:38.911557 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 17 17:48:38.912525 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 17 17:48:38.916148 initrd-setup-root-after-ignition[976]: grep: /sysroot/oem/oem-release: No such file or directory
Mar 17 17:48:38.918319 initrd-setup-root-after-ignition[978]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:48:38.918319 initrd-setup-root-after-ignition[978]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:48:38.921336 initrd-setup-root-after-ignition[982]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 17 17:48:38.920681 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:48:38.922652 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 17 17:48:38.933353 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 17 17:48:38.952122 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 17 17:48:38.952273 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 17 17:48:38.954565 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 17 17:48:38.956407 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 17 17:48:38.958245 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 17 17:48:38.959099 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 17 17:48:38.974946 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:48:38.987112 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 17 17:48:38.995169 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:48:38.996454 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:48:38.998517 systemd[1]: Stopped target timers.target - Timer Units.
Mar 17 17:48:39.000253 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 17 17:48:39.000385 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 17 17:48:39.002938 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 17 17:48:39.004986 systemd[1]: Stopped target basic.target - Basic System.
Mar 17 17:48:39.006581 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 17 17:48:39.008299 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 17 17:48:39.010218 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 17 17:48:39.012191 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 17 17:48:39.014004 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 17 17:48:39.016054 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 17 17:48:39.018015 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 17 17:48:39.019751 systemd[1]: Stopped target swap.target - Swaps.
Mar 17 17:48:39.021294 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 17 17:48:39.021432 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 17 17:48:39.023696 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 17 17:48:39.024864 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:48:39.026841 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 17 17:48:39.026951 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:48:39.028954 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 17 17:48:39.029080 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 17 17:48:39.031680 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 17 17:48:39.031800 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 17 17:48:39.034214 systemd[1]: Stopped target paths.target - Path Units.
Mar 17 17:48:39.035723 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 17 17:48:39.036450 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:48:39.037835 systemd[1]: Stopped target slices.target - Slice Units.
Mar 17 17:48:39.039515 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 17 17:48:39.041393 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 17 17:48:39.041476 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 17 17:48:39.043003 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 17 17:48:39.043087 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 17 17:48:39.044865 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 17 17:48:39.045004 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 17 17:48:39.047295 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 17 17:48:39.047401 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 17 17:48:39.060078 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 17 17:48:39.061017 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 17 17:48:39.061154 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:48:39.064414 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 17 17:48:39.065807 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 17 17:48:39.065981 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:48:39.068648 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 17 17:48:39.068777 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 17 17:48:39.075143 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 17 17:48:39.079018 ignition[1002]: INFO : Ignition 2.20.0
Mar 17 17:48:39.079018 ignition[1002]: INFO : Stage: umount
Mar 17 17:48:39.079018 ignition[1002]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 17 17:48:39.079018 ignition[1002]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 17 17:48:39.079018 ignition[1002]: INFO : umount: umount passed
Mar 17 17:48:39.079018 ignition[1002]: INFO : Ignition finished successfully
Mar 17 17:48:39.075229 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 17 17:48:39.078980 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 17 17:48:39.082182 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 17 17:48:39.082285 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 17 17:48:39.084838 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 17 17:48:39.084956 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 17 17:48:39.086694 systemd[1]: Stopped target network.target - Network.
Mar 17 17:48:39.087662 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 17 17:48:39.087730 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 17 17:48:39.089564 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 17 17:48:39.089617 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 17 17:48:39.092157 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 17 17:48:39.092208 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 17 17:48:39.093942 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 17 17:48:39.093993 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 17 17:48:39.095747 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 17 17:48:39.095795 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 17 17:48:39.097682 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 17 17:48:39.099287 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 17 17:48:39.107220 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 17 17:48:39.107332 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 17 17:48:39.110752 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 17 17:48:39.111071 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 17 17:48:39.111114 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:48:39.114100 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 17 17:48:39.115268 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 17 17:48:39.115367 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 17 17:48:39.119716 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 17 17:48:39.119868 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 17 17:48:39.119896 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:48:39.126299 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 17 17:48:39.127179 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 17 17:48:39.127242 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 17 17:48:39.130193 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 17 17:48:39.130242 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:48:39.132866 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 17 17:48:39.132920 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:48:39.133987 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:48:39.137247 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 17 17:48:39.142480 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 17 17:48:39.142570 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 17 17:48:39.148570 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 17 17:48:39.148683 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:48:39.150095 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 17 17:48:39.150130 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:48:39.151763 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 17 17:48:39.151792 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:48:39.153488 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 17 17:48:39.153533 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 17 17:48:39.156264 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 17 17:48:39.156309 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 17 17:48:39.159001 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 17 17:48:39.159046 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 17 17:48:39.170060 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 17 17:48:39.171099 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 17 17:48:39.171160 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:48:39.174189 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 17 17:48:39.174237 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 17 17:48:39.177634 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 17 17:48:39.177723 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 17 17:48:39.180051 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 17 17:48:39.182239 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 17 17:48:39.190836 systemd[1]: Switching root.
Mar 17 17:48:39.225333 systemd-journald[238]: Journal stopped
Mar 17 17:48:40.005550 systemd-journald[238]: Received SIGTERM from PID 1 (systemd).
Mar 17 17:48:40.005603 kernel: SELinux: policy capability network_peer_controls=1
Mar 17 17:48:40.005615 kernel: SELinux: policy capability open_perms=1
Mar 17 17:48:40.005625 kernel: SELinux: policy capability extended_socket_class=1
Mar 17 17:48:40.005636 kernel: SELinux: policy capability always_check_network=0
Mar 17 17:48:40.005645 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 17 17:48:40.005655 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 17 17:48:40.005664 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 17 17:48:40.005674 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 17 17:48:40.005687 kernel: audit: type=1403 audit(1742233719.374:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 17 17:48:40.005698 systemd[1]: Successfully loaded SELinux policy in 32.003ms.
Mar 17 17:48:40.005726 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.575ms.
Mar 17 17:48:40.005740 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 17 17:48:40.005751 systemd[1]: Detected virtualization kvm.
Mar 17 17:48:40.005761 systemd[1]: Detected architecture arm64.
Mar 17 17:48:40.005771 systemd[1]: Detected first boot.
Mar 17 17:48:40.005781 systemd[1]: Initializing machine ID from VM UUID.
Mar 17 17:48:40.005791 zram_generator::config[1049]: No configuration found.
Mar 17 17:48:40.005804 kernel: NET: Registered PF_VSOCK protocol family
Mar 17 17:48:40.005814 systemd[1]: Populated /etc with preset unit settings.
Mar 17 17:48:40.005825 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 17 17:48:40.005837 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 17 17:48:40.005847 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 17 17:48:40.005857 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:48:40.005867 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 17 17:48:40.005878 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 17 17:48:40.005888 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 17 17:48:40.005901 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 17 17:48:40.005927 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 17 17:48:40.005948 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 17 17:48:40.005959 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 17 17:48:40.005969 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 17 17:48:40.005979 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 17 17:48:40.005990 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 17 17:48:40.006000 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 17 17:48:40.006010 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 17 17:48:40.006020 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 17 17:48:40.006031 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 17 17:48:40.006043 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Mar 17 17:48:40.006055 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 17 17:48:40.006066 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 17 17:48:40.006076 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 17 17:48:40.006086 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 17 17:48:40.006097 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 17 17:48:40.006107 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 17 17:48:40.006118 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 17 17:48:40.006129 systemd[1]: Reached target slices.target - Slice Units.
Mar 17 17:48:40.006140 systemd[1]: Reached target swap.target - Swaps.
Mar 17 17:48:40.006150 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 17 17:48:40.006161 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 17 17:48:40.006171 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 17 17:48:40.006181 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 17 17:48:40.006191 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 17 17:48:40.006201 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 17 17:48:40.006212 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 17 17:48:40.006223 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 17 17:48:40.006234 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 17 17:48:40.006244 systemd[1]: Mounting media.mount - External Media Directory...
Mar 17 17:48:40.006268 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 17 17:48:40.006278 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 17 17:48:40.006290 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 17 17:48:40.006301 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 17 17:48:40.006311 systemd[1]: Reached target machines.target - Containers.
Mar 17 17:48:40.006322 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 17 17:48:40.006334 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:48:40.006344 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 17 17:48:40.006355 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 17 17:48:40.006364 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:48:40.006375 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:48:40.006385 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:48:40.006395 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 17 17:48:40.006406 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:48:40.006418 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 17 17:48:40.006428 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 17 17:48:40.006438 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 17 17:48:40.006448 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 17 17:48:40.006458 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 17 17:48:40.006467 kernel: fuse: init (API version 7.39)
Mar 17 17:48:40.006478 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:48:40.006494 kernel: loop: module loaded
Mar 17 17:48:40.006504 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 17 17:48:40.006516 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 17 17:48:40.006526 kernel: ACPI: bus type drm_connector registered
Mar 17 17:48:40.006536 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 17 17:48:40.006547 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 17 17:48:40.006558 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 17 17:48:40.006570 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 17 17:48:40.006598 systemd-journald[1117]: Collecting audit messages is disabled.
Mar 17 17:48:40.006619 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 17 17:48:40.006629 systemd[1]: Stopped verity-setup.service.
Mar 17 17:48:40.006640 systemd-journald[1117]: Journal started
Mar 17 17:48:40.006663 systemd-journald[1117]: Runtime Journal (/run/log/journal/59e6f4615cff4b2d88677c1d3dfd751d) is 5.9M, max 47.3M, 41.4M free.
Mar 17 17:48:39.801808 systemd[1]: Queued start job for default target multi-user.target.
Mar 17 17:48:39.814763 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 17 17:48:39.815192 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 17 17:48:40.010823 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 17 17:48:40.011402 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 17 17:48:40.012553 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 17 17:48:40.013805 systemd[1]: Mounted media.mount - External Media Directory.
Mar 17 17:48:40.014897 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 17 17:48:40.016101 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 17 17:48:40.017294 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 17 17:48:40.019950 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 17 17:48:40.021360 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 17 17:48:40.022888 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 17 17:48:40.023106 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 17 17:48:40.024604 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:48:40.024756 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:48:40.026195 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:48:40.026354 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:48:40.027662 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:48:40.027812 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:48:40.029392 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 17 17:48:40.029539 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 17 17:48:40.030861 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:48:40.031102 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:48:40.032596 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 17 17:48:40.034021 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 17 17:48:40.036510 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 17 17:48:40.038083 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 17 17:48:40.050867 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 17 17:48:40.061003 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 17 17:48:40.063036 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 17 17:48:40.064250 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 17 17:48:40.064290 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 17 17:48:40.066209 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 17 17:48:40.068410 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 17 17:48:40.070495 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 17 17:48:40.071597 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:48:40.072827 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 17 17:48:40.077092 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 17 17:48:40.078265 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:48:40.082087 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 17 17:48:40.083311 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:48:40.086636 systemd-journald[1117]: Time spent on flushing to /var/log/journal/59e6f4615cff4b2d88677c1d3dfd751d is 23.427ms for 868 entries.
Mar 17 17:48:40.086636 systemd-journald[1117]: System Journal (/var/log/journal/59e6f4615cff4b2d88677c1d3dfd751d) is 8M, max 195.6M, 187.6M free.
Mar 17 17:48:40.118195 systemd-journald[1117]: Received client request to flush runtime journal.
Mar 17 17:48:40.118244 kernel: loop0: detected capacity change from 0 to 113512
Mar 17 17:48:40.085153 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 17 17:48:40.088131 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 17 17:48:40.091209 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 17 17:48:40.095679 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 17 17:48:40.097234 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 17 17:48:40.098560 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 17 17:48:40.102032 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 17 17:48:40.104282 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 17 17:48:40.109324 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 17 17:48:40.122930 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 17 17:48:40.122960 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 17 17:48:40.125891 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 17 17:48:40.129950 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 17 17:48:40.134483 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 17 17:48:40.144985 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 17 17:48:40.151702 udevadm[1179]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 17 17:48:40.155928 kernel: loop1: detected capacity change from 0 to 189592
Mar 17 17:48:40.163280 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 17 17:48:40.174109 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 17 17:48:40.189985 kernel: loop2: detected capacity change from 0 to 123192
Mar 17 17:48:40.193397 systemd-tmpfiles[1186]: ACLs are not supported, ignoring.
Mar 17 17:48:40.193415 systemd-tmpfiles[1186]: ACLs are not supported, ignoring.
Mar 17 17:48:40.197288 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 17 17:48:40.230938 kernel: loop3: detected capacity change from 0 to 113512
Mar 17 17:48:40.236940 kernel: loop4: detected capacity change from 0 to 189592
Mar 17 17:48:40.241926 kernel: loop5: detected capacity change from 0 to 123192
Mar 17 17:48:40.245120 (sd-merge)[1191]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
Mar 17 17:48:40.245489 (sd-merge)[1191]: Merged extensions into '/usr'.
Mar 17 17:48:40.248889 systemd[1]: Reload requested from client PID 1166 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 17 17:48:40.249094 systemd[1]: Reloading...
Mar 17 17:48:40.312030 zram_generator::config[1217]: No configuration found.
Mar 17 17:48:40.356299 ldconfig[1161]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 17 17:48:40.401576 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:48:40.450768 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 17 17:48:40.451095 systemd[1]: Reloading finished in 201 ms.
Mar 17 17:48:40.473034 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 17 17:48:40.474640 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 17 17:48:40.491213 systemd[1]: Starting ensure-sysext.service...
Mar 17 17:48:40.492974 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 17 17:48:40.508654 systemd[1]: Reload requested from client PID 1253 ('systemctl') (unit ensure-sysext.service)...
Mar 17 17:48:40.508672 systemd[1]: Reloading...
Mar 17 17:48:40.508700 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 17 17:48:40.508893 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 17 17:48:40.509531 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 17 17:48:40.509738 systemd-tmpfiles[1254]: ACLs are not supported, ignoring.
Mar 17 17:48:40.509791 systemd-tmpfiles[1254]: ACLs are not supported, ignoring.
Mar 17 17:48:40.512760 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:48:40.512773 systemd-tmpfiles[1254]: Skipping /boot
Mar 17 17:48:40.521229 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot.
Mar 17 17:48:40.521246 systemd-tmpfiles[1254]: Skipping /boot
Mar 17 17:48:40.552136 zram_generator::config[1283]: No configuration found.
Mar 17 17:48:40.632081 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:48:40.681300 systemd[1]: Reloading finished in 172 ms.
Mar 17 17:48:40.690711 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 17 17:48:40.706660 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 17 17:48:40.714289 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:48:40.717149 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 17 17:48:40.719602 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 17 17:48:40.727219 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 17 17:48:40.735604 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 17 17:48:40.741183 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 17 17:48:40.744708 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:48:40.747829 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:48:40.752642 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:48:40.759737 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:48:40.761128 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:48:40.761243 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:48:40.765417 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 17 17:48:40.769467 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:48:40.769678 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:48:40.773169 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 17 17:48:40.778418 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:48:40.778596 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:48:40.780854 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:48:40.781081 systemd-udevd[1324]: Using default interface naming scheme 'v255'.
Mar 17 17:48:40.781364 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:48:40.789898 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:48:40.797311 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:48:40.800516 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:48:40.805099 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:48:40.806293 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:48:40.806420 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:48:40.808189 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 17 17:48:40.811759 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 17 17:48:40.816146 augenrules[1355]: No rules
Mar 17 17:48:40.817788 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:48:40.818008 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:48:40.821930 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 17 17:48:40.824632 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 17 17:48:40.826348 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:48:40.826558 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:48:40.828277 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:48:40.828520 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:48:40.838589 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:48:40.839959 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:48:40.843957 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 17 17:48:40.847283 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 17 17:48:40.871849 systemd[1]: Finished ensure-sysext.service.
Mar 17 17:48:40.876832 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Mar 17 17:48:40.886189 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:48:40.887258 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 17 17:48:40.890135 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 17 17:48:40.894177 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 17 17:48:40.898165 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 17 17:48:40.901497 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 17 17:48:40.906324 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 17 17:48:40.906370 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 17 17:48:40.908527 augenrules[1396]: /sbin/augenrules: No change
Mar 17 17:48:40.911312 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 17 17:48:40.920013 systemd-resolved[1322]: Positive Trust Anchors:
Mar 17 17:48:40.921860 systemd-resolved[1322]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 17 17:48:40.925327 augenrules[1421]: No rules
Mar 17 17:48:40.921891 systemd-resolved[1322]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 17 17:48:40.925154 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 17 17:48:40.926262 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 17 17:48:40.926945 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:48:40.927179 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:48:40.928526 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 17 17:48:40.928690 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 17 17:48:40.929738 systemd-resolved[1322]: Defaulting to hostname 'linux'.
Mar 17 17:48:40.930163 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 17 17:48:40.930316 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 17 17:48:40.932313 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 17 17:48:40.932479 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 17 17:48:40.934002 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 17 17:48:40.934154 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 17 17:48:40.938662 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 17 17:48:40.940934 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1366)
Mar 17 17:48:40.973781 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 17 17:48:40.975140 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 17 17:48:40.981424 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 17 17:48:40.982640 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 17 17:48:40.982704 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 17 17:48:41.002742 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 17 17:48:41.004644 systemd[1]: Reached target time-set.target - System Time Set.
Mar 17 17:48:41.007341 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 17 17:48:41.020159 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 17 17:48:41.021005 systemd-networkd[1414]: lo: Link UP
Mar 17 17:48:41.021242 systemd-networkd[1414]: lo: Gained carrier
Mar 17 17:48:41.022361 systemd-networkd[1414]: Enumeration completed
Mar 17 17:48:41.023016 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 17 17:48:41.024312 systemd[1]: Reached target network.target - Network.
Mar 17 17:48:41.025352 systemd-networkd[1414]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 17 17:48:41.025361 systemd-networkd[1414]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 17:48:41.025882 systemd-networkd[1414]: eth0: Link UP Mar 17 17:48:41.025889 systemd-networkd[1414]: eth0: Gained carrier Mar 17 17:48:41.025901 systemd-networkd[1414]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 17 17:48:41.026783 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 17 17:48:41.029104 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 17 17:48:41.031860 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 17 17:48:41.035709 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 17 17:48:41.051016 systemd-networkd[1414]: eth0: DHCPv4 address 10.0.0.115/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 17 17:48:41.051782 systemd-timesyncd[1420]: Network configuration changed, trying to establish connection. Mar 17 17:48:41.053270 lvm[1445]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:48:41.056106 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 17 17:48:41.057875 systemd-timesyncd[1420]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 17 17:48:41.058071 systemd-timesyncd[1420]: Initial clock synchronization to Mon 2025-03-17 17:48:41.054358 UTC. Mar 17 17:48:41.075844 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 17 17:48:41.090061 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 17 17:48:41.091541 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 17 17:48:41.092705 systemd[1]: Reached target sysinit.target - System Initialization. 
Mar 17 17:48:41.093870 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 17 17:48:41.095113 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 17 17:48:41.096474 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 17 17:48:41.097669 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 17 17:48:41.099084 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 17 17:48:41.100279 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 17:48:41.100315 systemd[1]: Reached target paths.target - Path Units. Mar 17 17:48:41.101209 systemd[1]: Reached target timers.target - Timer Units. Mar 17 17:48:41.102991 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 17 17:48:41.105249 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 17 17:48:41.108326 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 17 17:48:41.109697 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 17 17:48:41.110969 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 17 17:48:41.115692 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 17 17:48:41.117127 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 17 17:48:41.119364 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 17 17:48:41.120976 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 17 17:48:41.122106 systemd[1]: Reached target sockets.target - Socket Units. Mar 17 17:48:41.123025 systemd[1]: Reached target basic.target - Basic System. 
Mar 17 17:48:41.123953 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:48:41.123977 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 17 17:48:41.124824 systemd[1]: Starting containerd.service - containerd container runtime... Mar 17 17:48:41.126790 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 17 17:48:41.129934 lvm[1454]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 17:48:41.131064 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 17 17:48:41.136098 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 17 17:48:41.136412 jq[1457]: false Mar 17 17:48:41.137324 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 17 17:48:41.140176 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 17 17:48:41.147046 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 17 17:48:41.149021 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 17 17:48:41.149396 dbus-daemon[1456]: [system] SELinux support is enabled Mar 17 17:48:41.151430 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 17 17:48:41.155369 systemd[1]: Starting systemd-logind.service - User Login Management... 
Mar 17 17:48:41.162707 extend-filesystems[1458]: Found loop3 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found loop4 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found loop5 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda1 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda2 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda3 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found usr Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda4 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda6 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda7 Mar 17 17:48:41.162707 extend-filesystems[1458]: Found vda9 Mar 17 17:48:41.162707 extend-filesystems[1458]: Checking size of /dev/vda9 Mar 17 17:48:41.197627 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 17 17:48:41.157768 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 17:48:41.197794 extend-filesystems[1458]: Resized partition /dev/vda9 Mar 17 17:48:41.207681 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1370) Mar 17 17:48:41.158199 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 17 17:48:41.207899 extend-filesystems[1480]: resize2fs 1.47.1 (20-May-2024) Mar 17 17:48:41.160507 systemd[1]: Starting update-engine.service - Update Engine... Mar 17 17:48:41.163100 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 17 17:48:41.167601 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 17 17:48:41.213737 jq[1475]: true Mar 17 17:48:41.170718 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. 
Mar 17 17:48:41.174356 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 17:48:41.174552 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 17 17:48:41.214157 jq[1482]: true Mar 17 17:48:41.174810 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 17:48:41.175142 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 17 17:48:41.182329 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 17:48:41.182504 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 17 17:48:41.200242 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 17:48:41.200266 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 17 17:48:41.202647 (ntainerd)[1483]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 17 17:48:41.203797 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 17:48:41.203822 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Mar 17 17:48:41.220886 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 17 17:48:41.223827 update_engine[1471]: I20250317 17:48:41.223662 1471 main.cc:92] Flatcar Update Engine starting Mar 17 17:48:41.234238 extend-filesystems[1480]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 17 17:48:41.234238 extend-filesystems[1480]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 17 17:48:41.234238 extend-filesystems[1480]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 17 17:48:41.248733 extend-filesystems[1458]: Resized filesystem in /dev/vda9 Mar 17 17:48:41.251083 tar[1481]: linux-arm64/helm Mar 17 17:48:41.235054 systemd[1]: Started update-engine.service - Update Engine. Mar 17 17:48:41.251370 update_engine[1471]: I20250317 17:48:41.233764 1471 update_check_scheduler.cc:74] Next update check in 11m45s Mar 17 17:48:41.238611 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 17:48:41.238885 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 17 17:48:41.249223 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 17 17:48:41.261203 systemd-logind[1469]: Watching system buttons on /dev/input/event0 (Power Button) Mar 17 17:48:41.261388 systemd-logind[1469]: New seat seat0. Mar 17 17:48:41.262410 systemd[1]: Started systemd-logind.service - User Login Management. Mar 17 17:48:41.296021 bash[1512]: Updated "/home/core/.ssh/authorized_keys" Mar 17 17:48:41.298289 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 17 17:48:41.300260 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Mar 17 17:48:41.346957 locksmithd[1505]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 17:48:41.408486 sshd_keygen[1476]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 17:48:41.428068 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 17 17:48:41.430239 containerd[1483]: time="2025-03-17T17:48:41.430169600Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Mar 17 17:48:41.447187 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 17 17:48:41.454170 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 17:48:41.454357 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 17 17:48:41.462920 containerd[1483]: time="2025-03-17T17:48:41.462863120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464522 containerd[1483]: time="2025-03-17T17:48:41.464336960Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.83-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464522 containerd[1483]: time="2025-03-17T17:48:41.464369040Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 17:48:41.464522 containerd[1483]: time="2025-03-17T17:48:41.464385480Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 17:48:41.464639 containerd[1483]: time="2025-03-17T17:48:41.464541000Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Mar 17 17:48:41.464639 containerd[1483]: time="2025-03-17T17:48:41.464560040Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464639 containerd[1483]: time="2025-03-17T17:48:41.464611760Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464639 containerd[1483]: time="2025-03-17T17:48:41.464623120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464854 containerd[1483]: time="2025-03-17T17:48:41.464809280Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464854 containerd[1483]: time="2025-03-17T17:48:41.464832640Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464854 containerd[1483]: time="2025-03-17T17:48:41.464845720Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:48:41.464854 containerd[1483]: time="2025-03-17T17:48:41.464854000Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 17:48:41.465270 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 17 17:48:41.466341 containerd[1483]: time="2025-03-17T17:48:41.465790520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 17:48:41.466341 containerd[1483]: time="2025-03-17T17:48:41.466061160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
type=io.containerd.snapshotter.v1 Mar 17 17:48:41.466341 containerd[1483]: time="2025-03-17T17:48:41.466193360Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 17:48:41.466341 containerd[1483]: time="2025-03-17T17:48:41.466207560Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 17:48:41.466341 containerd[1483]: time="2025-03-17T17:48:41.466278720Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 17:48:41.466609 containerd[1483]: time="2025-03-17T17:48:41.466587240Z" level=info msg="metadata content store policy set" policy=shared Mar 17 17:48:41.470870 containerd[1483]: time="2025-03-17T17:48:41.470843160Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 17:48:41.471091 containerd[1483]: time="2025-03-17T17:48:41.471059200Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 17:48:41.471172 containerd[1483]: time="2025-03-17T17:48:41.471158600Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Mar 17 17:48:41.471324 containerd[1483]: time="2025-03-17T17:48:41.471307680Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Mar 17 17:48:41.471388 containerd[1483]: time="2025-03-17T17:48:41.471375600Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 17:48:41.471597 containerd[1483]: time="2025-03-17T17:48:41.471574280Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." 
type=io.containerd.monitor.v1 Mar 17 17:48:41.472131 containerd[1483]: time="2025-03-17T17:48:41.472108080Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472354960Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472389120Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472406880Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472420400Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472433400Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472445360Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472457800Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472471400Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472484640Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." 
type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472496000Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472508000Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472527800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472540920Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.472938 containerd[1483]: time="2025-03-17T17:48:41.472552120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472571560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472583640Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472595440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472608520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472621120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472633480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." 
type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472647480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472659000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472670400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472681960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472697320Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472717240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472731960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473267 containerd[1483]: time="2025-03-17T17:48:41.472754000Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.472953440Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.472972960Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.472984000Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.472995320Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.473008760Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.473023920Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.473033560Z" level=info msg="NRI interface is disabled by configuration." Mar 17 17:48:41.473504 containerd[1483]: time="2025-03-17T17:48:41.473045680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Mar 17 17:48:41.473400 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Mar 17 17:48:41.473684 containerd[1483]: time="2025-03-17T17:48:41.473371560Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: 
TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 17:48:41.473684 containerd[1483]: time="2025-03-17T17:48:41.473414920Z" level=info msg="Connect containerd service" Mar 17 17:48:41.473684 containerd[1483]: time="2025-03-17T17:48:41.473504760Z" level=info msg="using legacy CRI server" Mar 17 17:48:41.473684 containerd[1483]: time="2025-03-17T17:48:41.473528480Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 17 17:48:41.473861 containerd[1483]: time="2025-03-17T17:48:41.473779240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 17:48:41.474444 containerd[1483]: time="2025-03-17T17:48:41.474387840Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474632560Z" level=info msg="Start subscribing containerd event" Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474677000Z" level=info msg="Start recovering state" Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474736920Z" level=info msg="Start event monitor" Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474748920Z" level=info msg="Start snapshots syncer"
Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474758800Z" level=info msg="Start cni network conf syncer for default" Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474766160Z" level=info msg="Start streaming server" Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474867840Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 17:48:41.474939 containerd[1483]: time="2025-03-17T17:48:41.474905400Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 17:48:41.475119 containerd[1483]: time="2025-03-17T17:48:41.474996960Z" level=info msg="containerd successfully booted in 0.045735s" Mar 17 17:48:41.475582 systemd[1]: Started containerd.service - containerd container runtime. Mar 17 17:48:41.482278 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 17 17:48:41.484360 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 17 17:48:41.485648 systemd[1]: Reached target getty.target - Login Prompts. Mar 17 17:48:41.598677 tar[1481]: linux-arm64/LICENSE Mar 17 17:48:41.598677 tar[1481]: linux-arm64/README.md Mar 17 17:48:41.613241 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 17 17:48:42.494024 systemd-networkd[1414]: eth0: Gained IPv6LL Mar 17 17:48:42.497958 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 17 17:48:42.499660 systemd[1]: Reached target network-online.target - Network is Online. Mar 17 17:48:42.513262 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 17 17:48:42.515607 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 17 17:48:42.517709 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 17 17:48:42.531328 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 17 17:48:42.531553 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent.
Mar 17 17:48:42.533861 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 17 17:48:42.549331 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 17 17:48:43.015690 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 17 17:48:43.017316 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 17 17:48:43.018817 systemd[1]: Startup finished in 610ms (kernel) + 4.642s (initrd) + 3.686s (userspace) = 8.939s. Mar 17 17:48:43.019329 (kubelet)[1572]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 17 17:48:43.651469 kubelet[1572]: E0317 17:48:43.651380 1572 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 17:48:43.653814 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 17:48:43.653982 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 17:48:43.654305 systemd[1]: kubelet.service: Consumed 942ms CPU time, 235.3M memory peak. Mar 17 17:48:47.204560 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 17 17:48:47.216220 systemd[1]: Started sshd@0-10.0.0.115:22-10.0.0.1:52242.service - OpenSSH per-connection server daemon (10.0.0.1:52242). Mar 17 17:48:47.279023 sshd[1585]: Accepted publickey for core from 10.0.0.1 port 52242 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg Mar 17 17:48:47.280768 sshd-session[1585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 17 17:48:47.297893 systemd-logind[1469]: New session 1 of user core. 
Mar 17 17:48:47.298267 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 17 17:48:47.316284 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 17 17:48:47.329878 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 17 17:48:47.333066 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 17 17:48:47.342025 (systemd)[1589]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 17:48:47.344277 systemd-logind[1469]: New session c1 of user core. Mar 17 17:48:47.444975 systemd[1589]: Queued start job for default target default.target. Mar 17 17:48:47.453794 systemd[1589]: Created slice app.slice - User Application Slice. Mar 17 17:48:47.453815 systemd[1589]: Reached target paths.target - Paths. Mar 17 17:48:47.453856 systemd[1589]: Reached target timers.target - Timers. Mar 17 17:48:47.455061 systemd[1589]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 17 17:48:47.463751 systemd[1589]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 17 17:48:47.463808 systemd[1589]: Reached target sockets.target - Sockets. Mar 17 17:48:47.463843 systemd[1589]: Reached target basic.target - Basic System. Mar 17 17:48:47.463870 systemd[1589]: Reached target default.target - Main User Target. Mar 17 17:48:47.463894 systemd[1589]: Startup finished in 114ms. Mar 17 17:48:47.464074 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 17 17:48:47.481071 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 17 17:48:47.549028 systemd[1]: Started sshd@1-10.0.0.115:22-10.0.0.1:52254.service - OpenSSH per-connection server daemon (10.0.0.1:52254). 
Mar 17 17:48:47.593604 sshd[1600]: Accepted publickey for core from 10.0.0.1 port 52254 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:48:47.594935 sshd-session[1600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:48:47.599000 systemd-logind[1469]: New session 2 of user core.
Mar 17 17:48:47.611094 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 17 17:48:47.662226 sshd[1602]: Connection closed by 10.0.0.1 port 52254
Mar 17 17:48:47.663123 sshd-session[1600]: pam_unix(sshd:session): session closed for user core
Mar 17 17:48:47.678396 systemd[1]: sshd@1-10.0.0.115:22-10.0.0.1:52254.service: Deactivated successfully.
Mar 17 17:48:47.679838 systemd[1]: session-2.scope: Deactivated successfully.
Mar 17 17:48:47.680554 systemd-logind[1469]: Session 2 logged out. Waiting for processes to exit.
Mar 17 17:48:47.689184 systemd[1]: Started sshd@2-10.0.0.115:22-10.0.0.1:52262.service - OpenSSH per-connection server daemon (10.0.0.1:52262).
Mar 17 17:48:47.689734 systemd-logind[1469]: Removed session 2.
Mar 17 17:48:47.730029 sshd[1607]: Accepted publickey for core from 10.0.0.1 port 52262 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:48:47.731043 sshd-session[1607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:48:47.735956 systemd-logind[1469]: New session 3 of user core.
Mar 17 17:48:47.745074 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 17 17:48:47.793200 sshd[1610]: Connection closed by 10.0.0.1 port 52262
Mar 17 17:48:47.793651 sshd-session[1607]: pam_unix(sshd:session): session closed for user core
Mar 17 17:48:47.803111 systemd[1]: sshd@2-10.0.0.115:22-10.0.0.1:52262.service: Deactivated successfully.
Mar 17 17:48:47.804665 systemd[1]: session-3.scope: Deactivated successfully.
Mar 17 17:48:47.807037 systemd-logind[1469]: Session 3 logged out. Waiting for processes to exit.
Mar 17 17:48:47.808777 systemd[1]: Started sshd@3-10.0.0.115:22-10.0.0.1:52270.service - OpenSSH per-connection server daemon (10.0.0.1:52270).
Mar 17 17:48:47.809993 systemd-logind[1469]: Removed session 3.
Mar 17 17:48:47.853624 sshd[1615]: Accepted publickey for core from 10.0.0.1 port 52270 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:48:47.854826 sshd-session[1615]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:48:47.859262 systemd-logind[1469]: New session 4 of user core.
Mar 17 17:48:47.873466 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 17 17:48:47.926961 sshd[1618]: Connection closed by 10.0.0.1 port 52270
Mar 17 17:48:47.927270 sshd-session[1615]: pam_unix(sshd:session): session closed for user core
Mar 17 17:48:47.941598 systemd[1]: sshd@3-10.0.0.115:22-10.0.0.1:52270.service: Deactivated successfully.
Mar 17 17:48:47.943568 systemd[1]: session-4.scope: Deactivated successfully.
Mar 17 17:48:47.944568 systemd-logind[1469]: Session 4 logged out. Waiting for processes to exit.
Mar 17 17:48:47.952353 systemd[1]: Started sshd@4-10.0.0.115:22-10.0.0.1:52278.service - OpenSSH per-connection server daemon (10.0.0.1:52278).
Mar 17 17:48:47.953344 systemd-logind[1469]: Removed session 4.
Mar 17 17:48:47.994578 sshd[1623]: Accepted publickey for core from 10.0.0.1 port 52278 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:48:47.995668 sshd-session[1623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:48:48.000535 systemd-logind[1469]: New session 5 of user core.
Mar 17 17:48:48.020122 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 17 17:48:48.080749 sudo[1627]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 17 17:48:48.081062 sudo[1627]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:48:48.093717 sudo[1627]: pam_unix(sudo:session): session closed for user root
Mar 17 17:48:48.095982 sshd[1626]: Connection closed by 10.0.0.1 port 52278
Mar 17 17:48:48.095778 sshd-session[1623]: pam_unix(sshd:session): session closed for user core
Mar 17 17:48:48.114477 systemd[1]: sshd@4-10.0.0.115:22-10.0.0.1:52278.service: Deactivated successfully.
Mar 17 17:48:48.115940 systemd[1]: session-5.scope: Deactivated successfully.
Mar 17 17:48:48.117632 systemd-logind[1469]: Session 5 logged out. Waiting for processes to exit.
Mar 17 17:48:48.118726 systemd[1]: Started sshd@5-10.0.0.115:22-10.0.0.1:52284.service - OpenSSH per-connection server daemon (10.0.0.1:52284).
Mar 17 17:48:48.119516 systemd-logind[1469]: Removed session 5.
Mar 17 17:48:48.163971 sshd[1632]: Accepted publickey for core from 10.0.0.1 port 52284 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:48:48.165420 sshd-session[1632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:48:48.169964 systemd-logind[1469]: New session 6 of user core.
Mar 17 17:48:48.180050 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 17 17:48:48.232209 sudo[1637]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 17 17:48:48.232865 sudo[1637]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:48:48.236239 sudo[1637]: pam_unix(sudo:session): session closed for user root
Mar 17 17:48:48.241876 sudo[1636]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 17 17:48:48.242168 sudo[1636]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:48:48.258564 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 17 17:48:48.282705 augenrules[1659]: No rules
Mar 17 17:48:48.284066 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 17 17:48:48.284308 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 17 17:48:48.285352 sudo[1636]: pam_unix(sudo:session): session closed for user root
Mar 17 17:48:48.288546 sshd[1635]: Connection closed by 10.0.0.1 port 52284
Mar 17 17:48:48.288418 sshd-session[1632]: pam_unix(sshd:session): session closed for user core
Mar 17 17:48:48.297854 systemd[1]: sshd@5-10.0.0.115:22-10.0.0.1:52284.service: Deactivated successfully.
Mar 17 17:48:48.299816 systemd[1]: session-6.scope: Deactivated successfully.
Mar 17 17:48:48.300558 systemd-logind[1469]: Session 6 logged out. Waiting for processes to exit.
Mar 17 17:48:48.320303 systemd[1]: Started sshd@6-10.0.0.115:22-10.0.0.1:52292.service - OpenSSH per-connection server daemon (10.0.0.1:52292).
Mar 17 17:48:48.321318 systemd-logind[1469]: Removed session 6.
Mar 17 17:48:48.361111 sshd[1667]: Accepted publickey for core from 10.0.0.1 port 52292 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:48:48.362307 sshd-session[1667]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:48:48.366052 systemd-logind[1469]: New session 7 of user core.
Mar 17 17:48:48.387071 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 17 17:48:48.436838 sudo[1671]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 17 17:48:48.437439 sudo[1671]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 17 17:48:48.794201 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 17 17:48:48.794268 (dockerd)[1691]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 17 17:48:49.052025 dockerd[1691]: time="2025-03-17T17:48:49.051747635Z" level=info msg="Starting up"
Mar 17 17:48:49.229497 dockerd[1691]: time="2025-03-17T17:48:49.229455994Z" level=info msg="Loading containers: start."
Mar 17 17:48:49.388948 kernel: Initializing XFRM netlink socket
Mar 17 17:48:49.453125 systemd-networkd[1414]: docker0: Link UP
Mar 17 17:48:49.487122 dockerd[1691]: time="2025-03-17T17:48:49.487076741Z" level=info msg="Loading containers: done."
Mar 17 17:48:49.502213 dockerd[1691]: time="2025-03-17T17:48:49.502165567Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 17 17:48:49.502361 dockerd[1691]: time="2025-03-17T17:48:49.502247200Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1
Mar 17 17:48:49.502444 dockerd[1691]: time="2025-03-17T17:48:49.502415345Z" level=info msg="Daemon has completed initialization"
Mar 17 17:48:49.528812 dockerd[1691]: time="2025-03-17T17:48:49.528386638Z" level=info msg="API listen on /run/docker.sock"
Mar 17 17:48:49.528539 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 17 17:48:50.134512 containerd[1483]: time="2025-03-17T17:48:50.134462292Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\""
Mar 17 17:48:50.727109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount440910917.mount: Deactivated successfully.
Mar 17 17:48:51.800488 containerd[1483]: time="2025-03-17T17:48:51.800433239Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:51.800961 containerd[1483]: time="2025-03-17T17:48:51.800930159Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=25552768"
Mar 17 17:48:51.801816 containerd[1483]: time="2025-03-17T17:48:51.801786610Z" level=info msg="ImageCreate event name:\"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:51.805833 containerd[1483]: time="2025-03-17T17:48:51.805687856Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:51.806821 containerd[1483]: time="2025-03-17T17:48:51.806769089Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"25549566\" in 1.672255481s"
Mar 17 17:48:51.806821 containerd[1483]: time="2025-03-17T17:48:51.806802767Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\""
Mar 17 17:48:51.807734 containerd[1483]: time="2025-03-17T17:48:51.807688855Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\""
Mar 17 17:48:53.279799 containerd[1483]: time="2025-03-17T17:48:53.279746308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:53.280824 containerd[1483]: time="2025-03-17T17:48:53.280546527Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=22458980"
Mar 17 17:48:53.282044 containerd[1483]: time="2025-03-17T17:48:53.282006897Z" level=info msg="ImageCreate event name:\"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:53.285008 containerd[1483]: time="2025-03-17T17:48:53.284898119Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:53.286235 containerd[1483]: time="2025-03-17T17:48:53.286099868Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"23899774\" in 1.478368016s"
Mar 17 17:48:53.286235 containerd[1483]: time="2025-03-17T17:48:53.286141625Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\""
Mar 17 17:48:53.287745 containerd[1483]: time="2025-03-17T17:48:53.287537399Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\""
Mar 17 17:48:53.904293 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 17 17:48:53.914133 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:48:54.013869 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:48:54.019485 (kubelet)[1952]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 17 17:48:54.065422 kubelet[1952]: E0317 17:48:54.065347 1952 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 17 17:48:54.068284 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 17 17:48:54.068434 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 17 17:48:54.068734 systemd[1]: kubelet.service: Consumed 138ms CPU time, 96.7M memory peak.
Mar 17 17:48:54.758183 containerd[1483]: time="2025-03-17T17:48:54.758127231Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:54.759155 containerd[1483]: time="2025-03-17T17:48:54.758676630Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=17125831"
Mar 17 17:48:54.759867 containerd[1483]: time="2025-03-17T17:48:54.759831186Z" level=info msg="ImageCreate event name:\"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:54.763376 containerd[1483]: time="2025-03-17T17:48:54.763346329Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:54.765013 containerd[1483]: time="2025-03-17T17:48:54.764968690Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"18566643\" in 1.477387014s"
Mar 17 17:48:54.765013 containerd[1483]: time="2025-03-17T17:48:54.765008567Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\""
Mar 17 17:48:54.765481 containerd[1483]: time="2025-03-17T17:48:54.765458734Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\""
Mar 17 17:48:55.695003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3045423605.mount: Deactivated successfully.
Mar 17 17:48:55.916482 containerd[1483]: time="2025-03-17T17:48:55.916159786Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:55.916884 containerd[1483]: time="2025-03-17T17:48:55.916483643Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=26871917"
Mar 17 17:48:55.917489 containerd[1483]: time="2025-03-17T17:48:55.917431895Z" level=info msg="ImageCreate event name:\"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:55.919658 containerd[1483]: time="2025-03-17T17:48:55.919396076Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:55.920322 containerd[1483]: time="2025-03-17T17:48:55.920264855Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"26870934\" in 1.154770963s"
Mar 17 17:48:55.920322 containerd[1483]: time="2025-03-17T17:48:55.920302772Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\""
Mar 17 17:48:55.921129 containerd[1483]: time="2025-03-17T17:48:55.920981964Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 17 17:48:56.394608 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3920712127.mount: Deactivated successfully.
Mar 17 17:48:57.144791 containerd[1483]: time="2025-03-17T17:48:57.144741621Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:57.145850 containerd[1483]: time="2025-03-17T17:48:57.145774032Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485383"
Mar 17 17:48:57.146453 containerd[1483]: time="2025-03-17T17:48:57.146412910Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:57.150435 containerd[1483]: time="2025-03-17T17:48:57.149815363Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:57.151547 containerd[1483]: time="2025-03-17T17:48:57.151193552Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.23017967s"
Mar 17 17:48:57.151547 containerd[1483]: time="2025-03-17T17:48:57.151229909Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\""
Mar 17 17:48:57.151804 containerd[1483]: time="2025-03-17T17:48:57.151768113Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 17 17:48:57.590757 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4203472875.mount: Deactivated successfully.
Mar 17 17:48:57.595140 containerd[1483]: time="2025-03-17T17:48:57.595078263Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:57.595753 containerd[1483]: time="2025-03-17T17:48:57.595702182Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705"
Mar 17 17:48:57.596441 containerd[1483]: time="2025-03-17T17:48:57.596405815Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:57.599462 containerd[1483]: time="2025-03-17T17:48:57.599425494Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:57.600610 containerd[1483]: time="2025-03-17T17:48:57.600465145Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 448.660873ms"
Mar 17 17:48:57.600610 containerd[1483]: time="2025-03-17T17:48:57.600506902Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Mar 17 17:48:57.601208 containerd[1483]: time="2025-03-17T17:48:57.601154979Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Mar 17 17:48:58.089189 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3773387490.mount: Deactivated successfully.
Mar 17 17:48:59.517592 containerd[1483]: time="2025-03-17T17:48:59.517530529Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:59.518104 containerd[1483]: time="2025-03-17T17:48:59.518048577Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406427"
Mar 17 17:48:59.520222 containerd[1483]: time="2025-03-17T17:48:59.520191243Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:59.523773 containerd[1483]: time="2025-03-17T17:48:59.523737062Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:48:59.525481 containerd[1483]: time="2025-03-17T17:48:59.525443635Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 1.924259259s"
Mar 17 17:48:59.525518 containerd[1483]: time="2025-03-17T17:48:59.525481233Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Mar 17 17:49:04.241117 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 17 17:49:04.250146 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:49:04.301801 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 17 17:49:04.301875 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 17 17:49:04.302174 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:49:04.318241 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:49:04.341705 systemd[1]: Reload requested from client PID 2109 ('systemctl') (unit session-7.scope)...
Mar 17 17:49:04.341723 systemd[1]: Reloading...
Mar 17 17:49:04.406933 zram_generator::config[2153]: No configuration found.
Mar 17 17:49:04.521401 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:49:04.593089 systemd[1]: Reloading finished in 251 ms.
Mar 17 17:49:04.639071 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:49:04.641501 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:49:04.642666 systemd[1]: kubelet.service: Deactivated successfully.
Mar 17 17:49:04.642856 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:49:04.642894 systemd[1]: kubelet.service: Consumed 76ms CPU time, 82.4M memory peak.
Mar 17 17:49:04.644238 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:49:04.737360 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:49:04.741203 (kubelet)[2200]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 17 17:49:04.776171 kubelet[2200]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:49:04.776171 kubelet[2200]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 17 17:49:04.776171 kubelet[2200]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:49:04.776487 kubelet[2200]: I0317 17:49:04.776267 2200 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 17 17:49:05.899196 kubelet[2200]: I0317 17:49:05.899008 2200 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 17 17:49:05.899196 kubelet[2200]: I0317 17:49:05.899043 2200 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 17:49:05.899559 kubelet[2200]: I0317 17:49:05.899523 2200 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 17 17:49:05.940420 kubelet[2200]: E0317 17:49:05.940380 2200 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.115:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:05.943924 kubelet[2200]: I0317 17:49:05.943887 2200 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 17 17:49:05.953217 kubelet[2200]: E0317 17:49:05.953159 2200 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 17 17:49:05.953217 kubelet[2200]: I0317 17:49:05.953193 2200 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Mar 17 17:49:05.956581 kubelet[2200]: I0317 17:49:05.956551 2200 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 17 17:49:05.957398 kubelet[2200]: I0317 17:49:05.957371 2200 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 17 17:49:05.957551 kubelet[2200]: I0317 17:49:05.957508 2200 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 17 17:49:05.957710 kubelet[2200]: I0317 17:49:05.957540 2200 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 17 17:49:05.957798 kubelet[2200]: I0317 17:49:05.957771 2200 topology_manager.go:138] "Creating topology manager with none policy"
Mar 17 17:49:05.957798 kubelet[2200]: I0317 17:49:05.957781 2200 container_manager_linux.go:300] "Creating device plugin manager"
Mar 17 17:49:05.958002 kubelet[2200]: I0317 17:49:05.957971 2200 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:49:05.959951 kubelet[2200]: I0317 17:49:05.959544 2200 kubelet.go:408] "Attempting to sync node with API server"
Mar 17 17:49:05.959951 kubelet[2200]: I0317 17:49:05.959569 2200 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 17 17:49:05.959951 kubelet[2200]: I0317 17:49:05.959657 2200 kubelet.go:314] "Adding apiserver pod source"
Mar 17 17:49:05.959951 kubelet[2200]: I0317 17:49:05.959666 2200 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 17 17:49:05.961200 kubelet[2200]: I0317 17:49:05.961178 2200 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Mar 17 17:49:05.963562 kubelet[2200]: W0317 17:49:05.963503 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.115:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:05.963630 kubelet[2200]: E0317 17:49:05.963572 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.115:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:05.963780 kubelet[2200]: W0317 17:49:05.963743 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.115:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:05.963855 kubelet[2200]: E0317 17:49:05.963839 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.115:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:05.963903 kubelet[2200]: I0317 17:49:05.963871 2200 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 17 17:49:05.967299 kubelet[2200]: W0317 17:49:05.967277 2200 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 17 17:49:05.968269 kubelet[2200]: I0317 17:49:05.968157 2200 server.go:1269] "Started kubelet"
Mar 17 17:49:05.971003 kubelet[2200]: I0317 17:49:05.970076 2200 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 17 17:49:05.971003 kubelet[2200]: I0317 17:49:05.970243 2200 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 17 17:49:05.971003 kubelet[2200]: I0317 17:49:05.970474 2200 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 17 17:49:05.971226 kubelet[2200]: I0317 17:49:05.971198 2200 server.go:460] "Adding debug handlers to kubelet server"
Mar 17 17:49:05.971472 kubelet[2200]: I0317 17:49:05.971454 2200 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 17 17:49:05.975875 kubelet[2200]: I0317 17:49:05.975853 2200 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 17 17:49:05.976948 kubelet[2200]: I0317 17:49:05.976930 2200 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 17 17:49:05.977168 kubelet[2200]: I0317 17:49:05.977146 2200 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 17 17:49:05.977565 kubelet[2200]: W0317 17:49:05.977467 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.115:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:05.977565 kubelet[2200]: E0317 17:49:05.977523 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.115:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:05.977638 kubelet[2200]: I0317 17:49:05.977611 2200 reconciler.go:26] "Reconciler: start to sync state"
Mar 17 17:49:05.977815 kubelet[2200]: E0317 17:49:05.977771 2200 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 17 17:49:05.977891 kubelet[2200]: E0317 17:49:05.977851 2200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.115:6443: connect: connection refused" interval="200ms"
Mar 17 17:49:05.978066 kubelet[2200]: I0317 17:49:05.978044 2200 factory.go:221] Registration of the systemd container factory successfully
Mar 17 17:49:05.978127 kubelet[2200]: I0317 17:49:05.978109 2200 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 17 17:49:05.979009 kubelet[2200]: E0317 17:49:05.978985 2200 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 17 17:49:05.979536 kubelet[2200]: I0317 17:49:05.979507 2200 factory.go:221] Registration of the containerd container factory successfully
Mar 17 17:49:05.981030 kubelet[2200]: E0317 17:49:05.978482 2200 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.115:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.115:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182da85c7820fe06 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-17 17:49:05.968135686 +0000 UTC m=+1.223940737,LastTimestamp:2025-03-17 17:49:05.968135686 +0000 UTC m=+1.223940737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Mar 17 17:49:05.989372 kubelet[2200]: I0317 17:49:05.989344 2200 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 17 17:49:05.989372 kubelet[2200]: I0317 17:49:05.989364 2200 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 17 17:49:05.989372 kubelet[2200]: I0317 17:49:05.989380 2200 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:49:05.989589 kubelet[2200]: I0317 17:49:05.989560 2200 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 17 17:49:05.990528 kubelet[2200]: I0317 17:49:05.990498 2200 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 17 17:49:05.990528 kubelet[2200]: I0317 17:49:05.990522 2200 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 17 17:49:05.990619 kubelet[2200]: I0317 17:49:05.990540 2200 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 17 17:49:05.990619 kubelet[2200]: E0317 17:49:05.990574 2200 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 17 17:49:06.046844 kubelet[2200]: I0317 17:49:06.046780 2200 policy_none.go:49] "None policy: Start"
Mar 17 17:49:06.047355 kubelet[2200]: W0317 17:49:06.047305 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.115:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:06.047470 kubelet[2200]: E0317 17:49:06.047364 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.115:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:06.047532 kubelet[2200]: I0317 17:49:06.047508 2200 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 17 17:49:06.047559 kubelet[2200]: I0317 17:49:06.047535 2200 state_mem.go:35] "Initializing new in-memory state store"
Mar 17 17:49:06.054121 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 17 17:49:06.071692 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 17 17:49:06.074401 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 17 17:49:06.078035 kubelet[2200]: E0317 17:49:06.078008 2200 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 17 17:49:06.084558 kubelet[2200]: I0317 17:49:06.084525 2200 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 17 17:49:06.084718 kubelet[2200]: I0317 17:49:06.084696 2200 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 17 17:49:06.084751 kubelet[2200]: I0317 17:49:06.084711 2200 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 17 17:49:06.084995 kubelet[2200]: I0317 17:49:06.084978 2200 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 17 17:49:06.085797 kubelet[2200]: E0317 17:49:06.085769 2200 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Mar 17 17:49:06.099619 systemd[1]: Created slice kubepods-burstable-pod1215b5bf44b06dfc18fa431008c5700d.slice - libcontainer container kubepods-burstable-pod1215b5bf44b06dfc18fa431008c5700d.slice.
Mar 17 17:49:06.131360 systemd[1]: Created slice kubepods-burstable-pod60762308083b5ef6c837b1be48ec53d6.slice - libcontainer container kubepods-burstable-pod60762308083b5ef6c837b1be48ec53d6.slice.
Mar 17 17:49:06.135384 systemd[1]: Created slice kubepods-burstable-pod6f32907a07e55aea05abdc5cd284a8d5.slice - libcontainer container kubepods-burstable-pod6f32907a07e55aea05abdc5cd284a8d5.slice.
Mar 17 17:49:06.178333 kubelet[2200]: I0317 17:49:06.178222 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1215b5bf44b06dfc18fa431008c5700d-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"1215b5bf44b06dfc18fa431008c5700d\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:06.178458 kubelet[2200]: E0317 17:49:06.178253 2200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.115:6443: connect: connection refused" interval="400ms"
Mar 17 17:49:06.187452 kubelet[2200]: I0317 17:49:06.187428 2200 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 17 17:49:06.187769 kubelet[2200]: E0317 17:49:06.187748 2200 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.115:6443/api/v1/nodes\": dial tcp 10.0.0.115:6443: connect: connection refused" node="localhost"
Mar 17 17:49:06.278623 kubelet[2200]: I0317 17:49:06.278585 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1215b5bf44b06dfc18fa431008c5700d-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"1215b5bf44b06dfc18fa431008c5700d\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:06.278755 kubelet[2200]: I0317 17:49:06.278640 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:06.278755 kubelet[2200]: I0317 17:49:06.278675 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:06.278755 kubelet[2200]: I0317 17:49:06.278702 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f32907a07e55aea05abdc5cd284a8d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6f32907a07e55aea05abdc5cd284a8d5\") " pod="kube-system/kube-scheduler-localhost"
Mar 17 17:49:06.278755 kubelet[2200]: I0317 17:49:06.278750 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:06.279012 kubelet[2200]: I0317 17:49:06.278821 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:06.279012 kubelet[2200]: I0317 17:49:06.278855 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:06.279012 kubelet[2200]: I0317 17:49:06.278871 2200 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1215b5bf44b06dfc18fa431008c5700d-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"1215b5bf44b06dfc18fa431008c5700d\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:06.388941 kubelet[2200]: I0317 17:49:06.388895 2200 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 17 17:49:06.389215 kubelet[2200]: E0317 17:49:06.389183 2200 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.115:6443/api/v1/nodes\": dial tcp 10.0.0.115:6443: connect: connection refused" node="localhost"
Mar 17 17:49:06.430117 containerd[1483]: time="2025-03-17T17:49:06.430019184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:1215b5bf44b06dfc18fa431008c5700d,Namespace:kube-system,Attempt:0,}"
Mar 17 17:49:06.434846 containerd[1483]: time="2025-03-17T17:49:06.434718469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:60762308083b5ef6c837b1be48ec53d6,Namespace:kube-system,Attempt:0,}"
Mar 17 17:49:06.438473 containerd[1483]: time="2025-03-17T17:49:06.438448963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6f32907a07e55aea05abdc5cd284a8d5,Namespace:kube-system,Attempt:0,}"
Mar 17 17:49:06.579425 kubelet[2200]: E0317 17:49:06.579345 2200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.115:6443: connect: connection refused" interval="800ms"
Mar 17 17:49:06.790846 kubelet[2200]: I0317 17:49:06.790818 2200 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 17 17:49:06.791204 kubelet[2200]: E0317 17:49:06.791154 2200 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.115:6443/api/v1/nodes\": dial tcp 10.0.0.115:6443: connect: connection refused" node="localhost"
Mar 17 17:49:06.888544 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2538469300.mount: Deactivated successfully.
Mar 17 17:49:06.890331 containerd[1483]: time="2025-03-17T17:49:06.890173902Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:49:06.892231 containerd[1483]: time="2025-03-17T17:49:06.892175642Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269175"
Mar 17 17:49:06.894211 containerd[1483]: time="2025-03-17T17:49:06.894144143Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:49:06.894984 containerd[1483]: time="2025-03-17T17:49:06.894936864Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:49:06.896264 containerd[1483]: time="2025-03-17T17:49:06.896161522Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 17 17:49:06.897354 containerd[1483]: time="2025-03-17T17:49:06.897291866Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:49:06.897604 containerd[1483]: time="2025-03-17T17:49:06.897581891Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Mar 17 17:49:06.901220 containerd[1483]: time="2025-03-17T17:49:06.901158353Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 17 17:49:06.902292 containerd[1483]: time="2025-03-17T17:49:06.902077547Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 463.429874ms"
Mar 17 17:49:06.903564 containerd[1483]: time="2025-03-17T17:49:06.903526034Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 468.750728ms"
Mar 17 17:49:06.904278 containerd[1483]: time="2025-03-17T17:49:06.904248358Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 474.150137ms"
Mar 17 17:49:06.934086 kubelet[2200]: W0317 17:49:06.934015 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.115:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:06.934086 kubelet[2200]: E0317 17:49:06.934090 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.115:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:07.063018 containerd[1483]: time="2025-03-17T17:49:07.061795057Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:49:07.063018 containerd[1483]: time="2025-03-17T17:49:07.061860894Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:49:07.063018 containerd[1483]: time="2025-03-17T17:49:07.061875733Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:07.063018 containerd[1483]: time="2025-03-17T17:49:07.061968529Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:07.065517 containerd[1483]: time="2025-03-17T17:49:07.065413522Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:49:07.065936 containerd[1483]: time="2025-03-17T17:49:07.065629951Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:49:07.066268 containerd[1483]: time="2025-03-17T17:49:07.066176405Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:07.066438 containerd[1483]: time="2025-03-17T17:49:07.066399034Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:07.066999 containerd[1483]: time="2025-03-17T17:49:07.066882691Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:49:07.066999 containerd[1483]: time="2025-03-17T17:49:07.066975766Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:49:07.067120 containerd[1483]: time="2025-03-17T17:49:07.066987726Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:07.067312 containerd[1483]: time="2025-03-17T17:49:07.067269032Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:07.082154 systemd[1]: Started cri-containerd-4d0a75534c2cd8049f8dd51ba0a463070d87ad94c9ebadb79a1bec8b1493a19f.scope - libcontainer container 4d0a75534c2cd8049f8dd51ba0a463070d87ad94c9ebadb79a1bec8b1493a19f.
Mar 17 17:49:07.087100 systemd[1]: Started cri-containerd-1881f1abc8a0ad9a81025627fd60a20042d305d002d9d2f942b2f568583dcafd.scope - libcontainer container 1881f1abc8a0ad9a81025627fd60a20042d305d002d9d2f942b2f568583dcafd.
Mar 17 17:49:07.088782 systemd[1]: Started cri-containerd-8c4f597256d12578dd2e0bd11d3fc1474875465cf5f0b11c3599f5bab35c106f.scope - libcontainer container 8c4f597256d12578dd2e0bd11d3fc1474875465cf5f0b11c3599f5bab35c106f.
Mar 17 17:49:07.101019 kubelet[2200]: W0317 17:49:07.100811 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.115:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:07.101019 kubelet[2200]: E0317 17:49:07.100885 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.115:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:07.117065 containerd[1483]: time="2025-03-17T17:49:07.116311657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:1215b5bf44b06dfc18fa431008c5700d,Namespace:kube-system,Attempt:0,} returns sandbox id \"4d0a75534c2cd8049f8dd51ba0a463070d87ad94c9ebadb79a1bec8b1493a19f\""
Mar 17 17:49:07.121278 containerd[1483]: time="2025-03-17T17:49:07.121218939Z" level=info msg="CreateContainer within sandbox \"4d0a75534c2cd8049f8dd51ba0a463070d87ad94c9ebadb79a1bec8b1493a19f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 17 17:49:07.129332 containerd[1483]: time="2025-03-17T17:49:07.129285389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:60762308083b5ef6c837b1be48ec53d6,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c4f597256d12578dd2e0bd11d3fc1474875465cf5f0b11c3599f5bab35c106f\""
Mar 17 17:49:07.132568 containerd[1483]: time="2025-03-17T17:49:07.132527392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6f32907a07e55aea05abdc5cd284a8d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"1881f1abc8a0ad9a81025627fd60a20042d305d002d9d2f942b2f568583dcafd\""
Mar 17 17:49:07.132815 containerd[1483]: time="2025-03-17T17:49:07.132757381Z" level=info msg="CreateContainer within sandbox \"8c4f597256d12578dd2e0bd11d3fc1474875465cf5f0b11c3599f5bab35c106f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 17 17:49:07.135705 containerd[1483]: time="2025-03-17T17:49:07.135672519Z" level=info msg="CreateContainer within sandbox \"1881f1abc8a0ad9a81025627fd60a20042d305d002d9d2f942b2f568583dcafd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 17 17:49:07.141922 containerd[1483]: time="2025-03-17T17:49:07.141864100Z" level=info msg="CreateContainer within sandbox \"4d0a75534c2cd8049f8dd51ba0a463070d87ad94c9ebadb79a1bec8b1493a19f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"be6c5a7e7ef6413b57d4ddfd843aa31fbb22194ad60ff063bd9384eea21ec110\""
Mar 17 17:49:07.142801 containerd[1483]: time="2025-03-17T17:49:07.142772776Z" level=info msg="StartContainer for \"be6c5a7e7ef6413b57d4ddfd843aa31fbb22194ad60ff063bd9384eea21ec110\""
Mar 17 17:49:07.149950 containerd[1483]: time="2025-03-17T17:49:07.149893151Z" level=info msg="CreateContainer within sandbox \"8c4f597256d12578dd2e0bd11d3fc1474875465cf5f0b11c3599f5bab35c106f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"0ef5684a1e61cf7e0aa088d932128ce4dd90d2fc6ad5e1b175550d8b26d239ef\""
Mar 17 17:49:07.150574 containerd[1483]: time="2025-03-17T17:49:07.150549199Z" level=info msg="StartContainer for \"0ef5684a1e61cf7e0aa088d932128ce4dd90d2fc6ad5e1b175550d8b26d239ef\""
Mar 17 17:49:07.153414 containerd[1483]: time="2025-03-17T17:49:07.153366423Z" level=info msg="CreateContainer within sandbox \"1881f1abc8a0ad9a81025627fd60a20042d305d002d9d2f942b2f568583dcafd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8a0832c31655fa4807edda7635b81ec63bfdae43046f3393a43418d171efd303\""
Mar 17 17:49:07.154017 containerd[1483]: time="2025-03-17T17:49:07.153993832Z" level=info msg="StartContainer for \"8a0832c31655fa4807edda7635b81ec63bfdae43046f3393a43418d171efd303\""
Mar 17 17:49:07.174106 systemd[1]: Started cri-containerd-be6c5a7e7ef6413b57d4ddfd843aa31fbb22194ad60ff063bd9384eea21ec110.scope - libcontainer container be6c5a7e7ef6413b57d4ddfd843aa31fbb22194ad60ff063bd9384eea21ec110.
Mar 17 17:49:07.178190 systemd[1]: Started cri-containerd-0ef5684a1e61cf7e0aa088d932128ce4dd90d2fc6ad5e1b175550d8b26d239ef.scope - libcontainer container 0ef5684a1e61cf7e0aa088d932128ce4dd90d2fc6ad5e1b175550d8b26d239ef.
Mar 17 17:49:07.179656 systemd[1]: Started cri-containerd-8a0832c31655fa4807edda7635b81ec63bfdae43046f3393a43418d171efd303.scope - libcontainer container 8a0832c31655fa4807edda7635b81ec63bfdae43046f3393a43418d171efd303.
Mar 17 17:49:07.212458 containerd[1483]: time="2025-03-17T17:49:07.212363926Z" level=info msg="StartContainer for \"be6c5a7e7ef6413b57d4ddfd843aa31fbb22194ad60ff063bd9384eea21ec110\" returns successfully"
Mar 17 17:49:07.220711 containerd[1483]: time="2025-03-17T17:49:07.220671403Z" level=info msg="StartContainer for \"0ef5684a1e61cf7e0aa088d932128ce4dd90d2fc6ad5e1b175550d8b26d239ef\" returns successfully"
Mar 17 17:49:07.228057 containerd[1483]: time="2025-03-17T17:49:07.227890534Z" level=info msg="StartContainer for \"8a0832c31655fa4807edda7635b81ec63bfdae43046f3393a43418d171efd303\" returns successfully"
Mar 17 17:49:07.250325 kubelet[2200]: W0317 17:49:07.250257 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.115:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:07.250466 kubelet[2200]: E0317 17:49:07.250368 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.115:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:07.356707 kubelet[2200]: W0317 17:49:07.356519 2200 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.115:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.115:6443: connect: connection refused
Mar 17 17:49:07.356707 kubelet[2200]: E0317 17:49:07.356591 2200 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.115:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.115:6443: connect: connection refused" logger="UnhandledError"
Mar 17 17:49:07.380808 kubelet[2200]: E0317 17:49:07.380740 2200 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.115:6443: connect: connection refused" interval="1.6s"
Mar 17 17:49:07.592686 kubelet[2200]: I0317 17:49:07.592583 2200 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 17 17:49:08.962195 kubelet[2200]: I0317 17:49:08.961963 2200 apiserver.go:52] "Watching apiserver"
Mar 17 17:49:09.014290 kubelet[2200]: E0317 17:49:09.014246 2200 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Mar 17 17:49:09.077828 kubelet[2200]: I0317 17:49:09.077790 2200 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 17 17:49:09.158753 kubelet[2200]: I0317 17:49:09.158713 2200 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Mar 17 17:49:11.195217 systemd[1]: Reload requested from client PID 2477 ('systemctl') (unit session-7.scope)...
Mar 17 17:49:11.195239 systemd[1]: Reloading...
Mar 17 17:49:11.265960 zram_generator::config[2527]: No configuration found.
Mar 17 17:49:11.414068 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 17 17:49:11.497755 systemd[1]: Reloading finished in 302 ms.
Mar 17 17:49:11.518049 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:49:11.529124 systemd[1]: kubelet.service: Deactivated successfully.
Mar 17 17:49:11.529371 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:49:11.529434 systemd[1]: kubelet.service: Consumed 1.590s CPU time, 117.8M memory peak.
Mar 17 17:49:11.541233 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 17 17:49:11.646099 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 17 17:49:11.650837 (kubelet)[2562]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 17 17:49:11.690121 kubelet[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:49:11.690121 kubelet[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 17 17:49:11.690121 kubelet[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 17 17:49:11.691113 kubelet[2562]: I0317 17:49:11.690139    2562 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 17 17:49:11.697966 kubelet[2562]: I0317 17:49:11.697921    2562 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 17 17:49:11.697966 kubelet[2562]: I0317 17:49:11.697951    2562 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 17 17:49:11.698209 kubelet[2562]: I0317 17:49:11.698188    2562 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 17 17:49:11.699645 kubelet[2562]: I0317 17:49:11.699610    2562 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 17 17:49:11.702932 kubelet[2562]: I0317 17:49:11.702597    2562 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 17 17:49:11.705313 kubelet[2562]: E0317 17:49:11.705281    2562 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 17 17:49:11.705313 kubelet[2562]: I0317 17:49:11.705309    2562 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Mar 17 17:49:11.708076 kubelet[2562]: I0317 17:49:11.708044    2562 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 17 17:49:11.708209 kubelet[2562]: I0317 17:49:11.708188    2562 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 17 17:49:11.708335 kubelet[2562]: I0317 17:49:11.708295    2562 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 17 17:49:11.708500 kubelet[2562]: I0317 17:49:11.708325    2562 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 17 17:49:11.708572 kubelet[2562]: I0317 17:49:11.708505    2562 topology_manager.go:138] "Creating topology manager with none policy"
Mar 17 17:49:11.708572 kubelet[2562]: I0317 17:49:11.708514    2562 container_manager_linux.go:300] "Creating device plugin manager"
Mar 17 17:49:11.708572 kubelet[2562]: I0317 17:49:11.708546    2562 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:49:11.709126 kubelet[2562]: I0317 17:49:11.708647    2562 kubelet.go:408] "Attempting to sync node with API server"
Mar 17 17:49:11.709126 kubelet[2562]: I0317 17:49:11.708665    2562 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 17 17:49:11.709126 kubelet[2562]: I0317 17:49:11.708687    2562 kubelet.go:314] "Adding apiserver pod source"
Mar 17 17:49:11.709126 kubelet[2562]: I0317 17:49:11.708696    2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 17 17:49:11.709433 kubelet[2562]: I0317 17:49:11.709411    2562 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Mar 17 17:49:11.710249 kubelet[2562]: I0317 17:49:11.709870    2562 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 17 17:49:11.712298 kubelet[2562]: I0317 17:49:11.712275    2562 server.go:1269] "Started kubelet"
Mar 17 17:49:11.720367 kubelet[2562]: I0317 17:49:11.718264    2562 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 17 17:49:11.720797 kubelet[2562]: I0317 17:49:11.720672    2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 17 17:49:11.721344 kubelet[2562]: I0317 17:49:11.721319    2562 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 17 17:49:11.721456 kubelet[2562]: I0317 17:49:11.721001    2562 server.go:460] "Adding debug handlers to kubelet server"
Mar 17 17:49:11.722088 kubelet[2562]: I0317 17:49:11.722062    2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 17 17:49:11.724141 kubelet[2562]: E0317 17:49:11.724111    2562 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 17 17:49:11.724689 kubelet[2562]: I0317 17:49:11.724655    2562 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 17 17:49:11.726623 kubelet[2562]: E0317 17:49:11.726578    2562 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Mar 17 17:49:11.726746 kubelet[2562]: I0317 17:49:11.726734    2562 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 17 17:49:11.727034 kubelet[2562]: I0317 17:49:11.727003    2562 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 17 17:49:11.727240 kubelet[2562]: I0317 17:49:11.727227    2562 reconciler.go:26] "Reconciler: start to sync state"
Mar 17 17:49:11.727709 kubelet[2562]: I0317 17:49:11.727673    2562 factory.go:221] Registration of the systemd container factory successfully
Mar 17 17:49:11.727808 kubelet[2562]: I0317 17:49:11.727782    2562 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 17 17:49:11.729044 kubelet[2562]: I0317 17:49:11.729020    2562 factory.go:221] Registration of the containerd container factory successfully
Mar 17 17:49:11.737101 kubelet[2562]: I0317 17:49:11.736347    2562 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 17 17:49:11.737453 kubelet[2562]: I0317 17:49:11.737411    2562 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 17 17:49:11.737453 kubelet[2562]: I0317 17:49:11.737441    2562 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 17 17:49:11.737519 kubelet[2562]: I0317 17:49:11.737458    2562 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 17 17:49:11.737577 kubelet[2562]: E0317 17:49:11.737517    2562 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 17 17:49:11.766183 kubelet[2562]: I0317 17:49:11.766083    2562 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 17 17:49:11.766183 kubelet[2562]: I0317 17:49:11.766103    2562 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 17 17:49:11.766183 kubelet[2562]: I0317 17:49:11.766126    2562 state_mem.go:36] "Initialized new in-memory state store"
Mar 17 17:49:11.767121 kubelet[2562]: I0317 17:49:11.767098    2562 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 17 17:49:11.767199 kubelet[2562]: I0317 17:49:11.767119    2562 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 17 17:49:11.767199 kubelet[2562]: I0317 17:49:11.767138    2562 policy_none.go:49] "None policy: Start"
Mar 17 17:49:11.767846 kubelet[2562]: I0317 17:49:11.767816    2562 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 17 17:49:11.767846 kubelet[2562]: I0317 17:49:11.767844    2562 state_mem.go:35] "Initializing new in-memory state store"
Mar 17 17:49:11.768096 kubelet[2562]: I0317 17:49:11.768077    2562 state_mem.go:75] "Updated machine memory state"
Mar 17 17:49:11.772418 kubelet[2562]: I0317 17:49:11.772318    2562 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 17 17:49:11.772514 kubelet[2562]: I0317 17:49:11.772499    2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 17 17:49:11.772550 kubelet[2562]: I0317 17:49:11.772510    2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 17 17:49:11.772739 kubelet[2562]: I0317 17:49:11.772722    2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 17 17:49:11.874334 kubelet[2562]: I0317 17:49:11.874288    2562 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Mar 17 17:49:11.881441 kubelet[2562]: I0317 17:49:11.881402    2562 kubelet_node_status.go:111] "Node was previously registered" node="localhost"
Mar 17 17:49:11.881573 kubelet[2562]: I0317 17:49:11.881493    2562 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Mar 17 17:49:11.927973 kubelet[2562]: I0317 17:49:11.927891    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f32907a07e55aea05abdc5cd284a8d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6f32907a07e55aea05abdc5cd284a8d5\") " pod="kube-system/kube-scheduler-localhost"
Mar 17 17:49:12.029034 kubelet[2562]: I0317 17:49:12.028749    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1215b5bf44b06dfc18fa431008c5700d-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"1215b5bf44b06dfc18fa431008c5700d\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:12.029034 kubelet[2562]: I0317 17:49:12.028792    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:12.029034 kubelet[2562]: I0317 17:49:12.028808    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:12.029034 kubelet[2562]: I0317 17:49:12.028825    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:12.029034 kubelet[2562]: I0317 17:49:12.028844    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:12.029233 kubelet[2562]: I0317 17:49:12.028880    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1215b5bf44b06dfc18fa431008c5700d-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"1215b5bf44b06dfc18fa431008c5700d\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:12.029233 kubelet[2562]: I0317 17:49:12.028895    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1215b5bf44b06dfc18fa431008c5700d-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"1215b5bf44b06dfc18fa431008c5700d\") " pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:12.029233 kubelet[2562]: I0317 17:49:12.028923    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost"
Mar 17 17:49:12.709120 kubelet[2562]: I0317 17:49:12.709085    2562 apiserver.go:52] "Watching apiserver"
Mar 17 17:49:12.727375 kubelet[2562]: I0317 17:49:12.727332    2562 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 17 17:49:12.762843 kubelet[2562]: E0317 17:49:12.762776    2562 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
Mar 17 17:49:12.802303 kubelet[2562]: I0317 17:49:12.802228    2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.802208145 podStartE2EDuration="1.802208145s" podCreationTimestamp="2025-03-17 17:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:49:12.786365799 +0000 UTC m=+1.131958522" watchObservedRunningTime="2025-03-17 17:49:12.802208145 +0000 UTC m=+1.147800908"
Mar 17 17:49:12.802450 kubelet[2562]: I0317 17:49:12.802366    2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.802360938 podStartE2EDuration="1.802360938s" podCreationTimestamp="2025-03-17 17:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:49:12.802295941 +0000 UTC m=+1.147888704" watchObservedRunningTime="2025-03-17 17:49:12.802360938 +0000 UTC m=+1.147953701"
Mar 17 17:49:12.818356 kubelet[2562]: I0317 17:49:12.818291    2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.818271401 podStartE2EDuration="1.818271401s" podCreationTimestamp="2025-03-17 17:49:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:49:12.81805773 +0000 UTC m=+1.163650493" watchObservedRunningTime="2025-03-17 17:49:12.818271401 +0000 UTC m=+1.163864164"
Mar 17 17:49:16.358747 sudo[1671]: pam_unix(sudo:session): session closed for user root
Mar 17 17:49:16.360016 sshd[1670]: Connection closed by 10.0.0.1 port 52292
Mar 17 17:49:16.360671 sshd-session[1667]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:16.364573 systemd-logind[1469]: Session 7 logged out. Waiting for processes to exit.
Mar 17 17:49:16.365175 systemd[1]: sshd@6-10.0.0.115:22-10.0.0.1:52292.service: Deactivated successfully.
Mar 17 17:49:16.367455 systemd[1]: session-7.scope: Deactivated successfully.
Mar 17 17:49:16.367804 systemd[1]: session-7.scope: Consumed 6.713s CPU time, 215.9M memory peak.
Mar 17 17:49:16.369350 systemd-logind[1469]: Removed session 7.
Mar 17 17:49:18.048451 kubelet[2562]: I0317 17:49:18.048245    2562 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 17 17:49:18.049475 containerd[1483]: time="2025-03-17T17:49:18.048986065Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 17 17:49:18.049902 kubelet[2562]: I0317 17:49:18.049871    2562 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 17 17:49:18.915098 systemd[1]: Created slice kubepods-besteffort-pod47acd25c_5874_4a84_896c_912666f16f24.slice - libcontainer container kubepods-besteffort-pod47acd25c_5874_4a84_896c_912666f16f24.slice.
Mar 17 17:49:18.976544 kubelet[2562]: I0317 17:49:18.976482    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/47acd25c-5874-4a84-896c-912666f16f24-xtables-lock\") pod \"kube-proxy-55trt\" (UID: \"47acd25c-5874-4a84-896c-912666f16f24\") " pod="kube-system/kube-proxy-55trt"
Mar 17 17:49:18.976544 kubelet[2562]: I0317 17:49:18.976541    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/47acd25c-5874-4a84-896c-912666f16f24-kube-proxy\") pod \"kube-proxy-55trt\" (UID: \"47acd25c-5874-4a84-896c-912666f16f24\") " pod="kube-system/kube-proxy-55trt"
Mar 17 17:49:18.976709 kubelet[2562]: I0317 17:49:18.976560    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47acd25c-5874-4a84-896c-912666f16f24-lib-modules\") pod \"kube-proxy-55trt\" (UID: \"47acd25c-5874-4a84-896c-912666f16f24\") " pod="kube-system/kube-proxy-55trt"
Mar 17 17:49:18.976709 kubelet[2562]: I0317 17:49:18.976590    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-982bc\" (UniqueName: \"kubernetes.io/projected/47acd25c-5874-4a84-896c-912666f16f24-kube-api-access-982bc\") pod \"kube-proxy-55trt\" (UID: \"47acd25c-5874-4a84-896c-912666f16f24\") " pod="kube-system/kube-proxy-55trt"
Mar 17 17:49:19.135336 systemd[1]: Created slice kubepods-besteffort-pod86fee659_fad9_40ba_a71c_0f593a14a0fa.slice - libcontainer container kubepods-besteffort-pod86fee659_fad9_40ba_a71c_0f593a14a0fa.slice.
Mar 17 17:49:19.178079 kubelet[2562]: I0317 17:49:19.177962    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7s7\" (UniqueName: \"kubernetes.io/projected/86fee659-fad9-40ba-a71c-0f593a14a0fa-kube-api-access-sh7s7\") pod \"tigera-operator-64ff5465b7-g6pz5\" (UID: \"86fee659-fad9-40ba-a71c-0f593a14a0fa\") " pod="tigera-operator/tigera-operator-64ff5465b7-g6pz5"
Mar 17 17:49:19.178704 kubelet[2562]: I0317 17:49:19.178203    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/86fee659-fad9-40ba-a71c-0f593a14a0fa-var-lib-calico\") pod \"tigera-operator-64ff5465b7-g6pz5\" (UID: \"86fee659-fad9-40ba-a71c-0f593a14a0fa\") " pod="tigera-operator/tigera-operator-64ff5465b7-g6pz5"
Mar 17 17:49:19.226650 containerd[1483]: time="2025-03-17T17:49:19.226596009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-55trt,Uid:47acd25c-5874-4a84-896c-912666f16f24,Namespace:kube-system,Attempt:0,}"
Mar 17 17:49:19.247708 containerd[1483]: time="2025-03-17T17:49:19.247258566Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:49:19.247708 containerd[1483]: time="2025-03-17T17:49:19.247674432Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:49:19.247708 containerd[1483]: time="2025-03-17T17:49:19.247687872Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:19.247900 containerd[1483]: time="2025-03-17T17:49:19.247804828Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:19.262113 systemd[1]: Started cri-containerd-effbdeb76c6f6f0a2a2ebc9ffe58f700a7af1f7aa89d55ac1765a60fc0cbb63c.scope - libcontainer container effbdeb76c6f6f0a2a2ebc9ffe58f700a7af1f7aa89d55ac1765a60fc0cbb63c.
Mar 17 17:49:19.289302 containerd[1483]: time="2025-03-17T17:49:19.288791952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-55trt,Uid:47acd25c-5874-4a84-896c-912666f16f24,Namespace:kube-system,Attempt:0,} returns sandbox id \"effbdeb76c6f6f0a2a2ebc9ffe58f700a7af1f7aa89d55ac1765a60fc0cbb63c\""
Mar 17 17:49:19.295138 containerd[1483]: time="2025-03-17T17:49:19.295097423Z" level=info msg="CreateContainer within sandbox \"effbdeb76c6f6f0a2a2ebc9ffe58f700a7af1f7aa89d55ac1765a60fc0cbb63c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 17 17:49:19.309276 containerd[1483]: time="2025-03-17T17:49:19.309221796Z" level=info msg="CreateContainer within sandbox \"effbdeb76c6f6f0a2a2ebc9ffe58f700a7af1f7aa89d55ac1765a60fc0cbb63c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b3c8a7375079ec214cbe5077e2f6a43e1a7455c2d3ad746a63a2725c3ad3b881\""
Mar 17 17:49:19.309853 containerd[1483]: time="2025-03-17T17:49:19.309820096Z" level=info msg="StartContainer for \"b3c8a7375079ec214cbe5077e2f6a43e1a7455c2d3ad746a63a2725c3ad3b881\""
Mar 17 17:49:19.335107 systemd[1]: Started cri-containerd-b3c8a7375079ec214cbe5077e2f6a43e1a7455c2d3ad746a63a2725c3ad3b881.scope - libcontainer container b3c8a7375079ec214cbe5077e2f6a43e1a7455c2d3ad746a63a2725c3ad3b881.
Mar 17 17:49:19.360202 containerd[1483]: time="2025-03-17T17:49:19.360154711Z" level=info msg="StartContainer for \"b3c8a7375079ec214cbe5077e2f6a43e1a7455c2d3ad746a63a2725c3ad3b881\" returns successfully"
Mar 17 17:49:19.441438 containerd[1483]: time="2025-03-17T17:49:19.441112672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-g6pz5,Uid:86fee659-fad9-40ba-a71c-0f593a14a0fa,Namespace:tigera-operator,Attempt:0,}"
Mar 17 17:49:19.480003 containerd[1483]: time="2025-03-17T17:49:19.479847631Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 17:49:19.480003 containerd[1483]: time="2025-03-17T17:49:19.479936748Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 17:49:19.480003 containerd[1483]: time="2025-03-17T17:49:19.479972147Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:19.480640 containerd[1483]: time="2025-03-17T17:49:19.480500009Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:19.502138 systemd[1]: Started cri-containerd-3d3361beecbd382ab0f940707620f958714f1cd8e6a7201dd7c02bc4663bfc09.scope - libcontainer container 3d3361beecbd382ab0f940707620f958714f1cd8e6a7201dd7c02bc4663bfc09.
Mar 17 17:49:19.530043 containerd[1483]: time="2025-03-17T17:49:19.529445590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-g6pz5,Uid:86fee659-fad9-40ba-a71c-0f593a14a0fa,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3d3361beecbd382ab0f940707620f958714f1cd8e6a7201dd7c02bc4663bfc09\""
Mar 17 17:49:19.532222 containerd[1483]: time="2025-03-17T17:49:19.532029425Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 17 17:49:19.778257 kubelet[2562]: I0317 17:49:19.778000    2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-55trt" podStartSLOduration=1.777981168 podStartE2EDuration="1.777981168s" podCreationTimestamp="2025-03-17 17:49:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:49:19.777139915 +0000 UTC m=+8.122732678" watchObservedRunningTime="2025-03-17 17:49:19.777981168 +0000 UTC m=+8.123573931"
Mar 17 17:49:21.411089 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount131026677.mount: Deactivated successfully.
Mar 17 17:49:21.652332 containerd[1483]: time="2025-03-17T17:49:21.651479661Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:21.652332 containerd[1483]: time="2025-03-17T17:49:21.652165800Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115"
Mar 17 17:49:21.652864 containerd[1483]: time="2025-03-17T17:49:21.652833259Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:21.655037 containerd[1483]: time="2025-03-17T17:49:21.654995632Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:21.656044 containerd[1483]: time="2025-03-17T17:49:21.656012881Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 2.123946497s"
Mar 17 17:49:21.656110 containerd[1483]: time="2025-03-17T17:49:21.656044640Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\""
Mar 17 17:49:21.664051 containerd[1483]: time="2025-03-17T17:49:21.663685242Z" level=info msg="CreateContainer within sandbox \"3d3361beecbd382ab0f940707620f958714f1cd8e6a7201dd7c02bc4663bfc09\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 17 17:49:21.673020 containerd[1483]: time="2025-03-17T17:49:21.672969874Z" level=info msg="CreateContainer within sandbox \"3d3361beecbd382ab0f940707620f958714f1cd8e6a7201dd7c02bc4663bfc09\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"bb76753da2ccb15ee7b44ab1cc97d9b0d63f4fc142b8392203a2b0ad0059a9ba\""
Mar 17 17:49:21.673844 containerd[1483]: time="2025-03-17T17:49:21.673816968Z" level=info msg="StartContainer for \"bb76753da2ccb15ee7b44ab1cc97d9b0d63f4fc142b8392203a2b0ad0059a9ba\""
Mar 17 17:49:21.707112 systemd[1]: Started cri-containerd-bb76753da2ccb15ee7b44ab1cc97d9b0d63f4fc142b8392203a2b0ad0059a9ba.scope - libcontainer container bb76753da2ccb15ee7b44ab1cc97d9b0d63f4fc142b8392203a2b0ad0059a9ba.
Mar 17 17:49:21.731347 containerd[1483]: time="2025-03-17T17:49:21.731295703Z" level=info msg="StartContainer for \"bb76753da2ccb15ee7b44ab1cc97d9b0d63f4fc142b8392203a2b0ad0059a9ba\" returns successfully"
Mar 17 17:49:21.783691 kubelet[2562]: I0317 17:49:21.783630    2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-g6pz5" podStartSLOduration=0.653492376 podStartE2EDuration="2.783614159s" podCreationTimestamp="2025-03-17 17:49:19 +0000 UTC" firstStartedPulling="2025-03-17 17:49:19.530994779 +0000 UTC m=+7.876587542" lastFinishedPulling="2025-03-17 17:49:21.661116602 +0000 UTC m=+10.006709325" observedRunningTime="2025-03-17 17:49:21.783486443 +0000 UTC m=+10.129079206" watchObservedRunningTime="2025-03-17 17:49:21.783614159 +0000 UTC m=+10.129206922"
Mar 17 17:49:25.540530 systemd[1]: Created slice kubepods-besteffort-pod257112e3_3961_4fd4_8bed_270feb210dd4.slice - libcontainer container kubepods-besteffort-pod257112e3_3961_4fd4_8bed_270feb210dd4.slice.
Mar 17 17:49:25.606513 systemd[1]: Created slice kubepods-besteffort-podce4efe66_1a47_4cd2_b360_f53f1740b88e.slice - libcontainer container kubepods-besteffort-podce4efe66_1a47_4cd2_b360_f53f1740b88e.slice.
Mar 17 17:49:25.621551 kubelet[2562]: I0317 17:49:25.621454    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-flexvol-driver-host\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.621551 kubelet[2562]: I0317 17:49:25.621495    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce4efe66-1a47-4cd2-b360-f53f1740b88e-tigera-ca-bundle\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.621551 kubelet[2562]: I0317 17:49:25.621514    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-lib-modules\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.621994 kubelet[2562]: I0317 17:49:25.621554    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/257112e3-3961-4fd4-8bed-270feb210dd4-tigera-ca-bundle\") pod \"calico-typha-7f5d99b795-7gcc6\" (UID: \"257112e3-3961-4fd4-8bed-270feb210dd4\") " pod="calico-system/calico-typha-7f5d99b795-7gcc6"
Mar 17 17:49:25.621994 kubelet[2562]: I0317 17:49:25.621589    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/257112e3-3961-4fd4-8bed-270feb210dd4-typha-certs\") pod \"calico-typha-7f5d99b795-7gcc6\" (UID: \"257112e3-3961-4fd4-8bed-270feb210dd4\") " pod="calico-system/calico-typha-7f5d99b795-7gcc6"
Mar 17 17:49:25.621994 kubelet[2562]: I0317 17:49:25.621622    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl6l7\" (UniqueName: \"kubernetes.io/projected/ce4efe66-1a47-4cd2-b360-f53f1740b88e-kube-api-access-bl6l7\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.621994 kubelet[2562]: I0317 17:49:25.621641    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-var-run-calico\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.621994 kubelet[2562]: I0317 17:49:25.621657    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-var-lib-calico\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.622099 kubelet[2562]: I0317 17:49:25.621674    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-xtables-lock\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.622099 kubelet[2562]: I0317 17:49:25.621689    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-policysync\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.622099 kubelet[2562]: I0317 17:49:25.621702    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ce4efe66-1a47-4cd2-b360-f53f1740b88e-node-certs\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.622099 kubelet[2562]: I0317 17:49:25.621718    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-cni-log-dir\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.622099 kubelet[2562]: I0317 17:49:25.621734    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-cni-net-dir\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.622199 kubelet[2562]: I0317 17:49:25.621750    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b54kt\" (UniqueName: \"kubernetes.io/projected/257112e3-3961-4fd4-8bed-270feb210dd4-kube-api-access-b54kt\") pod \"calico-typha-7f5d99b795-7gcc6\" (UID: \"257112e3-3961-4fd4-8bed-270feb210dd4\") " pod="calico-system/calico-typha-7f5d99b795-7gcc6"
Mar 17 17:49:25.622199 kubelet[2562]: I0317 17:49:25.621766    2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ce4efe66-1a47-4cd2-b360-f53f1740b88e-cni-bin-dir\") pod \"calico-node-hznnh\" (UID: \"ce4efe66-1a47-4cd2-b360-f53f1740b88e\") " pod="calico-system/calico-node-hznnh"
Mar 17 17:49:25.711392 kubelet[2562]: E0317 17:49:25.711339    2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871"
Mar 17 17:49:25.729592 kubelet[2562]: E0317 17:49:25.729415    2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.729592 kubelet[2562]: W0317 17:49:25.729453    2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.729592 kubelet[2562]: E0317 17:49:25.729475    2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.730236 kubelet[2562]: E0317 17:49:25.730138    2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.730236 kubelet[2562]: W0317 17:49:25.730153    2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.732916 kubelet[2562]: E0317 17:49:25.730395    2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.732916 kubelet[2562]: E0317 17:49:25.730818    2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.732916 kubelet[2562]: W0317 17:49:25.730844    2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.733323 kubelet[2562]: E0317 17:49:25.733279    2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.733323 kubelet[2562]: W0317 17:49:25.733310    2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.733839 kubelet[2562]: E0317 17:49:25.733809    2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.733839 kubelet[2562]: W0317 17:49:25.733829    2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.737915 kubelet[2562]: E0317 17:49:25.734854    2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.738017 kubelet[2562]: E0317 17:49:25.736225 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.738090 kubelet[2562]: W0317 17:49:25.738076 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.738658 kubelet[2562]: E0317 17:49:25.738644 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.738741 kubelet[2562]: W0317 17:49:25.738728 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.739035 kubelet[2562]: E0317 17:49:25.739022 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.739108 kubelet[2562]: W0317 17:49:25.739096 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.740620 kubelet[2562]: E0317 17:49:25.740578 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.740662 kubelet[2562]: E0317 17:49:25.736237 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.740662 kubelet[2562]: E0317 17:49:25.736246 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.740662 kubelet[2562]: E0317 17:49:25.740640 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.740740 kubelet[2562]: E0317 17:49:25.740667 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.741320 kubelet[2562]: E0317 17:49:25.741215 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.741320 kubelet[2562]: W0317 17:49:25.741230 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.741465 kubelet[2562]: E0317 17:49:25.741451 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.741615 kubelet[2562]: W0317 17:49:25.741517 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.741730 kubelet[2562]: E0317 17:49:25.741716 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.741780 kubelet[2562]: W0317 17:49:25.741770 2562 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.741991 kubelet[2562]: E0317 17:49:25.741976 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.742088 kubelet[2562]: E0317 17:49:25.742007 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.742667 kubelet[2562]: W0317 17:49:25.742255 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.742767 kubelet[2562]: E0317 17:49:25.742027 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.742799 kubelet[2562]: E0317 17:49:25.742020 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.742799 kubelet[2562]: E0317 17:49:25.742792 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.744579 kubelet[2562]: E0317 17:49:25.744552 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.744761 kubelet[2562]: W0317 17:49:25.744660 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.744948 kubelet[2562]: E0317 17:49:25.744901 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.745090 kubelet[2562]: E0317 17:49:25.745076 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.745148 kubelet[2562]: W0317 17:49:25.745137 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.745264 kubelet[2562]: E0317 17:49:25.745237 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.745430 kubelet[2562]: E0317 17:49:25.745416 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.745566 kubelet[2562]: W0317 17:49:25.745486 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.745566 kubelet[2562]: E0317 17:49:25.745554 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.745737 kubelet[2562]: E0317 17:49:25.745724 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.745998 kubelet[2562]: W0317 17:49:25.745981 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.746228 kubelet[2562]: E0317 17:49:25.746183 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.747088 kubelet[2562]: E0317 17:49:25.746975 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.747088 kubelet[2562]: W0317 17:49:25.747008 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.747346 kubelet[2562]: E0317 17:49:25.747333 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.747497 kubelet[2562]: W0317 17:49:25.747406 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.748412 kubelet[2562]: E0317 17:49:25.748377 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.748412 kubelet[2562]: E0317 17:49:25.748400 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.749093 kubelet[2562]: E0317 17:49:25.749081 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.749205 kubelet[2562]: W0317 17:49:25.749155 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.749291 kubelet[2562]: E0317 17:49:25.749272 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.749639 kubelet[2562]: E0317 17:49:25.749509 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.749639 kubelet[2562]: W0317 17:49:25.749520 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.749761 kubelet[2562]: E0317 17:49:25.749746 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.749862 kubelet[2562]: E0317 17:49:25.749852 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.750138 kubelet[2562]: W0317 17:49:25.750084 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.751350 kubelet[2562]: E0317 17:49:25.751256 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.751350 kubelet[2562]: W0317 17:49:25.751270 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.751513 kubelet[2562]: E0317 17:49:25.751501 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.751577 kubelet[2562]: W0317 17:49:25.751566 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.751651 kubelet[2562]: E0317 17:49:25.751625 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.751687 kubelet[2562]: E0317 17:49:25.751664 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.751773 kubelet[2562]: E0317 17:49:25.751717 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.751894 kubelet[2562]: E0317 17:49:25.751882 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.751970 kubelet[2562]: W0317 17:49:25.751948 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.752073 kubelet[2562]: E0317 17:49:25.752035 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.752438 kubelet[2562]: E0317 17:49:25.752334 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.752438 kubelet[2562]: W0317 17:49:25.752347 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.753735 kubelet[2562]: E0317 17:49:25.753488 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.753940 kubelet[2562]: E0317 17:49:25.753924 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.754028 kubelet[2562]: W0317 17:49:25.754003 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.754414 kubelet[2562]: E0317 17:49:25.754388 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.756183 kubelet[2562]: E0317 17:49:25.756164 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.756349 kubelet[2562]: W0317 17:49:25.756249 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.756544 kubelet[2562]: E0317 17:49:25.756465 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.756544 kubelet[2562]: W0317 17:49:25.756477 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.757085 kubelet[2562]: E0317 17:49:25.757002 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.757085 kubelet[2562]: W0317 17:49:25.757015 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.757085 kubelet[2562]: E0317 17:49:25.757020 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.757085 kubelet[2562]: E0317 17:49:25.757055 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.757085 kubelet[2562]: E0317 17:49:25.757077 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.757711 kubelet[2562]: E0317 17:49:25.757582 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.757711 kubelet[2562]: W0317 17:49:25.757595 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.757711 kubelet[2562]: E0317 17:49:25.757639 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.757859 kubelet[2562]: E0317 17:49:25.757847 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.757916 kubelet[2562]: W0317 17:49:25.757896 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.758022 kubelet[2562]: E0317 17:49:25.757997 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.758363 kubelet[2562]: E0317 17:49:25.758351 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.758430 kubelet[2562]: W0317 17:49:25.758418 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.759930 kubelet[2562]: E0317 17:49:25.758502 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.761104 kubelet[2562]: E0317 17:49:25.761078 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.761104 kubelet[2562]: W0317 17:49:25.761096 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.761589 kubelet[2562]: E0317 17:49:25.761576 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.761589 kubelet[2562]: W0317 17:49:25.761589 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.761714 kubelet[2562]: E0317 17:49:25.761567 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.761741 kubelet[2562]: E0317 17:49:25.761720 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.761741 kubelet[2562]: E0317 17:49:25.761736 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.761786 kubelet[2562]: W0317 17:49:25.761744 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.762341 kubelet[2562]: E0317 17:49:25.762315 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.763313 kubelet[2562]: E0317 17:49:25.763294 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.763313 kubelet[2562]: W0317 17:49:25.763308 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.763493 kubelet[2562]: E0317 17:49:25.763479 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.763493 kubelet[2562]: W0317 17:49:25.763490 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.763684 kubelet[2562]: E0317 17:49:25.763672 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.763684 kubelet[2562]: W0317 17:49:25.763685 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.763879 kubelet[2562]: E0317 17:49:25.763868 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.764007 kubelet[2562]: W0317 17:49:25.763880 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.764007 kubelet[2562]: E0317 17:49:25.763891 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.764007 kubelet[2562]: E0317 17:49:25.763937 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.764007 kubelet[2562]: E0317 17:49:25.763950 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.764177 kubelet[2562]: E0317 17:49:25.764158 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.774039 kubelet[2562]: E0317 17:49:25.774001 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.774039 kubelet[2562]: W0317 17:49:25.774021 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.776160 kubelet[2562]: E0317 17:49:25.775985 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.776885 kubelet[2562]: E0317 17:49:25.776753 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.776885 kubelet[2562]: W0317 17:49:25.776767 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.778620 kubelet[2562]: E0317 17:49:25.776783 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.781964 kubelet[2562]: E0317 17:49:25.781944 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.781964 kubelet[2562]: W0317 17:49:25.781962 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.782090 kubelet[2562]: E0317 17:49:25.782002 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.812312 kubelet[2562]: E0317 17:49:25.812195 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.812312 kubelet[2562]: W0317 17:49:25.812226 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.812312 kubelet[2562]: E0317 17:49:25.812245 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.813239 kubelet[2562]: E0317 17:49:25.813214 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.813375 kubelet[2562]: W0317 17:49:25.813233 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.813414 kubelet[2562]: E0317 17:49:25.813379 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.814167 kubelet[2562]: E0317 17:49:25.814120 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.814167 kubelet[2562]: W0317 17:49:25.814138 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.814720 kubelet[2562]: E0317 17:49:25.814150 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.815151 kubelet[2562]: E0317 17:49:25.815094 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.815151 kubelet[2562]: W0317 17:49:25.815114 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.815151 kubelet[2562]: E0317 17:49:25.815126 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.816748 kubelet[2562]: E0317 17:49:25.815300 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.816748 kubelet[2562]: W0317 17:49:25.815311 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.816748 kubelet[2562]: E0317 17:49:25.815319 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.817092 kubelet[2562]: E0317 17:49:25.817073 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.817092 kubelet[2562]: W0317 17:49:25.817086 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.817092 kubelet[2562]: E0317 17:49:25.817096 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.817498 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823163 kubelet[2562]: W0317 17:49:25.817509 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.817518 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.817699 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823163 kubelet[2562]: W0317 17:49:25.817707 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.817715 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.817870 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823163 kubelet[2562]: W0317 17:49:25.817878 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.817886 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823163 kubelet[2562]: E0317 17:49:25.818037 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823417 kubelet[2562]: W0317 17:49:25.818044 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823417 kubelet[2562]: E0317 17:49:25.818051 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823417 kubelet[2562]: E0317 17:49:25.818204 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823417 kubelet[2562]: W0317 17:49:25.818211 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823417 kubelet[2562]: E0317 17:49:25.818220 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823417 kubelet[2562]: E0317 17:49:25.818871 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823417 kubelet[2562]: W0317 17:49:25.818891 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823417 kubelet[2562]: E0317 17:49:25.818919 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823417 kubelet[2562]: E0317 17:49:25.819114 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823417 kubelet[2562]: W0317 17:49:25.819124 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819133 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819298 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823612 kubelet[2562]: W0317 17:49:25.819305 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819313 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819511 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823612 kubelet[2562]: W0317 17:49:25.819519 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819528 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819737 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823612 kubelet[2562]: W0317 17:49:25.819747 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823612 kubelet[2562]: E0317 17:49:25.819758 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.819953 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823810 kubelet[2562]: W0317 17:49:25.819963 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.819972 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.820532 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823810 kubelet[2562]: W0317 17:49:25.820546 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.820556 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.820802 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.823810 kubelet[2562]: W0317 17:49:25.820813 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.820832 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.823810 kubelet[2562]: E0317 17:49:25.821005 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.824447 kubelet[2562]: W0317 17:49:25.821012 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.824447 kubelet[2562]: E0317 17:49:25.821021 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.824447 kubelet[2562]: E0317 17:49:25.821192 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.824447 kubelet[2562]: W0317 17:49:25.821200 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.824447 kubelet[2562]: E0317 17:49:25.821208 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.824447 kubelet[2562]: E0317 17:49:25.821348 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.824447 kubelet[2562]: W0317 17:49:25.821355 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.824447 kubelet[2562]: E0317 17:49:25.821362 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.824447 kubelet[2562]: E0317 17:49:25.821492 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.824447 kubelet[2562]: W0317 17:49:25.821499 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.821506 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.821642 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.825433 kubelet[2562]: W0317 17:49:25.821650 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.821657 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.821804 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.825433 kubelet[2562]: W0317 17:49:25.821812 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.821819 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.823886 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.825433 kubelet[2562]: W0317 17:49:25.823934 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825433 kubelet[2562]: E0317 17:49:25.823950 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.825668 kubelet[2562]: I0317 17:49:25.823980 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxv5\" (UniqueName: \"kubernetes.io/projected/1bd3d1fb-9cd7-48fa-86bf-ebd468d40871-kube-api-access-fdxv5\") pod \"csi-node-driver-zkh2s\" (UID: \"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871\") " pod="calico-system/csi-node-driver-zkh2s"
Mar 17 17:49:25.825668 kubelet[2562]: E0317 17:49:25.824238 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.825668 kubelet[2562]: W0317 17:49:25.824250 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825668 kubelet[2562]: E0317 17:49:25.824294 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.825668 kubelet[2562]: I0317 17:49:25.824313 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bd3d1fb-9cd7-48fa-86bf-ebd468d40871-socket-dir\") pod \"csi-node-driver-zkh2s\" (UID: \"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871\") " pod="calico-system/csi-node-driver-zkh2s"
Mar 17 17:49:25.825668 kubelet[2562]: E0317 17:49:25.824557 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.825668 kubelet[2562]: W0317 17:49:25.824570 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825668 kubelet[2562]: E0317 17:49:25.824587 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.825845 kubelet[2562]: I0317 17:49:25.824611 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bd3d1fb-9cd7-48fa-86bf-ebd468d40871-registration-dir\") pod \"csi-node-driver-zkh2s\" (UID: \"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871\") " pod="calico-system/csi-node-driver-zkh2s"
Mar 17 17:49:25.825845 kubelet[2562]: E0317 17:49:25.824880 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.825845 kubelet[2562]: W0317 17:49:25.824895 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.825845 kubelet[2562]: E0317 17:49:25.824988 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.825845 kubelet[2562]: I0317 17:49:25.825010 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/1bd3d1fb-9cd7-48fa-86bf-ebd468d40871-varrun\") pod \"csi-node-driver-zkh2s\" (UID: \"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871\") " pod="calico-system/csi-node-driver-zkh2s"
Mar 17 17:49:25.826070 kubelet[2562]: E0317 17:49:25.826055 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.826070 kubelet[2562]: W0317 17:49:25.826068 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.826137 kubelet[2562]: E0317 17:49:25.826083 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.826317 kubelet[2562]: E0317 17:49:25.826303 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.826348 kubelet[2562]: W0317 17:49:25.826320 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.826381 kubelet[2562]: E0317 17:49:25.826351 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.826578 kubelet[2562]: E0317 17:49:25.826564 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.826614 kubelet[2562]: W0317 17:49:25.826578 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.826753 kubelet[2562]: E0317 17:49:25.826689 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.827011 kubelet[2562]: E0317 17:49:25.826996 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.827098 kubelet[2562]: W0317 17:49:25.827012 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.827188 kubelet[2562]: E0317 17:49:25.827163 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.827430 kubelet[2562]: E0317 17:49:25.827358 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.827465 kubelet[2562]: W0317 17:49:25.827430 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.827689 kubelet[2562]: E0317 17:49:25.827632 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.827845 kubelet[2562]: I0317 17:49:25.827829 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bd3d1fb-9cd7-48fa-86bf-ebd468d40871-kubelet-dir\") pod \"csi-node-driver-zkh2s\" (UID: \"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871\") " pod="calico-system/csi-node-driver-zkh2s"
Mar 17 17:49:25.828131 kubelet[2562]: E0317 17:49:25.828116 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.828235 kubelet[2562]: W0317 17:49:25.828131 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.828353 kubelet[2562]: E0317 17:49:25.828338 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.828722 kubelet[2562]: E0317 17:49:25.828658 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.828722 kubelet[2562]: W0317 17:49:25.828674 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.829090 kubelet[2562]: E0317 17:49:25.828686 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.829311 kubelet[2562]: E0317 17:49:25.829241 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.829393 kubelet[2562]: W0317 17:49:25.829313 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.829426 kubelet[2562]: E0317 17:49:25.829332 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.829624 kubelet[2562]: E0317 17:49:25.829597 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.829837 kubelet[2562]: W0317 17:49:25.829625 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.829837 kubelet[2562]: E0317 17:49:25.829640 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.829993 kubelet[2562]: E0317 17:49:25.829984 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.830020 kubelet[2562]: W0317 17:49:25.829993 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.830020 kubelet[2562]: E0317 17:49:25.830003 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.830468 kubelet[2562]: E0317 17:49:25.830233 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.830468 kubelet[2562]: W0317 17:49:25.830244 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.830468 kubelet[2562]: E0317 17:49:25.830253 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.844943 containerd[1483]: time="2025-03-17T17:49:25.844888522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7f5d99b795-7gcc6,Uid:257112e3-3961-4fd4-8bed-270feb210dd4,Namespace:calico-system,Attempt:0,}"
Mar 17 17:49:25.913092 containerd[1483]: time="2025-03-17T17:49:25.913039739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hznnh,Uid:ce4efe66-1a47-4cd2-b360-f53f1740b88e,Namespace:calico-system,Attempt:0,}"
Mar 17 17:49:25.929433 kubelet[2562]: E0317 17:49:25.929407 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.930052 kubelet[2562]: W0317 17:49:25.929614 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.930052 kubelet[2562]: E0317 17:49:25.929640 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.930380 kubelet[2562]: E0317 17:49:25.930366 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.930474 kubelet[2562]: W0317 17:49:25.930454 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.930535 kubelet[2562]: E0317 17:49:25.930524 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.930839 kubelet[2562]: E0317 17:49:25.930827 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.931010 kubelet[2562]: W0317 17:49:25.930937 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.931010 kubelet[2562]: E0317 17:49:25.930964 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.931329 kubelet[2562]: E0317 17:49:25.931316 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.931463 kubelet[2562]: W0317 17:49:25.931401 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.931463 kubelet[2562]: E0317 17:49:25.931421 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.932032 kubelet[2562]: E0317 17:49:25.931963 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.932032 kubelet[2562]: W0317 17:49:25.931983 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.932168 kubelet[2562]: E0317 17:49:25.932070 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.932550 kubelet[2562]: E0317 17:49:25.932460 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.932550 kubelet[2562]: W0317 17:49:25.932472 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.932734 kubelet[2562]: E0317 17:49:25.932625 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.932952 kubelet[2562]: E0317 17:49:25.932838 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.932952 kubelet[2562]: W0317 17:49:25.932848 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.932952 kubelet[2562]: E0317 17:49:25.932879 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 17:49:25.933551 kubelet[2562]: E0317 17:49:25.933200 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.933551 kubelet[2562]: W0317 17:49:25.933212 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.933551 kubelet[2562]: E0317 17:49:25.933225 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 17:49:25.933742 kubelet[2562]: E0317 17:49:25.933729 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 17:49:25.933800 kubelet[2562]: W0317 17:49:25.933789 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 17:49:25.933861 kubelet[2562]: E0317 17:49:25.933851 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.934224 kubelet[2562]: E0317 17:49:25.934211 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.934311 kubelet[2562]: W0317 17:49:25.934298 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.934368 kubelet[2562]: E0317 17:49:25.934358 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.934635 kubelet[2562]: E0317 17:49:25.934594 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.934635 kubelet[2562]: W0317 17:49:25.934615 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.935022 kubelet[2562]: E0317 17:49:25.934844 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.935022 kubelet[2562]: E0317 17:49:25.934934 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.935022 kubelet[2562]: W0317 17:49:25.934942 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.935022 kubelet[2562]: E0317 17:49:25.935002 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.935215 kubelet[2562]: E0317 17:49:25.935203 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.935266 kubelet[2562]: W0317 17:49:25.935255 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.935381 kubelet[2562]: E0317 17:49:25.935321 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.935763 kubelet[2562]: E0317 17:49:25.935728 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.935763 kubelet[2562]: W0317 17:49:25.935742 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.936303 kubelet[2562]: E0317 17:49:25.936288 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.938940 kubelet[2562]: E0317 17:49:25.936408 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.938940 kubelet[2562]: W0317 17:49:25.936419 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.938940 kubelet[2562]: E0317 17:49:25.936535 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.940360 kubelet[2562]: E0317 17:49:25.940236 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.940360 kubelet[2562]: W0317 17:49:25.940251 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.940360 kubelet[2562]: E0317 17:49:25.940330 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.942009 kubelet[2562]: E0317 17:49:25.941874 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.942009 kubelet[2562]: W0317 17:49:25.941891 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.942173 kubelet[2562]: E0317 17:49:25.942161 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.942300 kubelet[2562]: W0317 17:49:25.942219 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.942361 kubelet[2562]: E0317 17:49:25.942340 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.942448 kubelet[2562]: E0317 17:49:25.942418 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.942514 kubelet[2562]: E0317 17:49:25.942504 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.942568 kubelet[2562]: W0317 17:49:25.942554 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.942692 kubelet[2562]: E0317 17:49:25.942671 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.943833 kubelet[2562]: E0317 17:49:25.943817 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.944642 kubelet[2562]: W0317 17:49:25.943921 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.944642 kubelet[2562]: E0317 17:49:25.943968 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.944642 kubelet[2562]: E0317 17:49:25.944233 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.944642 kubelet[2562]: W0317 17:49:25.944242 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.944642 kubelet[2562]: E0317 17:49:25.944310 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.944642 kubelet[2562]: E0317 17:49:25.944429 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.944642 kubelet[2562]: W0317 17:49:25.944438 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.944642 kubelet[2562]: E0317 17:49:25.944502 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.945360 kubelet[2562]: E0317 17:49:25.944841 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.945360 kubelet[2562]: W0317 17:49:25.944853 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.945360 kubelet[2562]: E0317 17:49:25.944898 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.945360 kubelet[2562]: E0317 17:49:25.945064 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.945360 kubelet[2562]: W0317 17:49:25.945076 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.945360 kubelet[2562]: E0317 17:49:25.945089 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 17:49:25.945578 kubelet[2562]: E0317 17:49:25.945534 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.945578 kubelet[2562]: W0317 17:49:25.945547 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.945578 kubelet[2562]: E0317 17:49:25.945557 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.946378 kubelet[2562]: E0317 17:49:25.946334 2562 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 17:49:25.946378 kubelet[2562]: W0317 17:49:25.946356 2562 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 17:49:25.946378 kubelet[2562]: E0317 17:49:25.946369 2562 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 17:49:25.949704 containerd[1483]: time="2025-03-17T17:49:25.949444703Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:25.949704 containerd[1483]: time="2025-03-17T17:49:25.949583259Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:25.949704 containerd[1483]: time="2025-03-17T17:49:25.949630898Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:25.950014 containerd[1483]: time="2025-03-17T17:49:25.949720496Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:25.950784 containerd[1483]: time="2025-03-17T17:49:25.950717308Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:25.950863 containerd[1483]: time="2025-03-17T17:49:25.950792706Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:25.950863 containerd[1483]: time="2025-03-17T17:49:25.950810906Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:25.951028 containerd[1483]: time="2025-03-17T17:49:25.950987421Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:25.975104 systemd[1]: Started cri-containerd-d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779.scope - libcontainer container d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779. Mar 17 17:49:25.978679 systemd[1]: Started cri-containerd-d88e27a8f5f5d836528fb868013c1e010cc77824ff9ced745a16d346f84ff8d1.scope - libcontainer container d88e27a8f5f5d836528fb868013c1e010cc77824ff9ced745a16d346f84ff8d1. 
Mar 17 17:49:26.004443 containerd[1483]: time="2025-03-17T17:49:26.004152090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hznnh,Uid:ce4efe66-1a47-4cd2-b360-f53f1740b88e,Namespace:calico-system,Attempt:0,} returns sandbox id \"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\"" Mar 17 17:49:26.021748 containerd[1483]: time="2025-03-17T17:49:26.021691585Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 17 17:49:26.038475 containerd[1483]: time="2025-03-17T17:49:26.038186988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7f5d99b795-7gcc6,Uid:257112e3-3961-4fd4-8bed-270feb210dd4,Namespace:calico-system,Attempt:0,} returns sandbox id \"d88e27a8f5f5d836528fb868013c1e010cc77824ff9ced745a16d346f84ff8d1\"" Mar 17 17:49:26.082623 update_engine[1471]: I20250317 17:49:26.080024 1471 update_attempter.cc:509] Updating boot flags... Mar 17 17:49:26.129011 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (3166) Mar 17 17:49:26.204080 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (3164) Mar 17 17:49:26.261985 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (3164) Mar 17 17:49:26.738535 kubelet[2562]: E0317 17:49:26.738080 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:27.066406 containerd[1483]: time="2025-03-17T17:49:27.066360405Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:27.067378 containerd[1483]: 
time="2025-03-17T17:49:27.067170984Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 17 17:49:27.068312 containerd[1483]: time="2025-03-17T17:49:27.068138960Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:27.070262 containerd[1483]: time="2025-03-17T17:49:27.070223186Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:27.071232 containerd[1483]: time="2025-03-17T17:49:27.071104364Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.04936598s" Mar 17 17:49:27.071232 containerd[1483]: time="2025-03-17T17:49:27.071135163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 17 17:49:27.074813 containerd[1483]: time="2025-03-17T17:49:27.074781829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 17 17:49:27.085704 containerd[1483]: time="2025-03-17T17:49:27.085669470Z" level=info msg="CreateContainer within sandbox \"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 17:49:27.097049 containerd[1483]: time="2025-03-17T17:49:27.096935901Z" level=info msg="CreateContainer within sandbox 
\"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a\"" Mar 17 17:49:27.098988 containerd[1483]: time="2025-03-17T17:49:27.098960649Z" level=info msg="StartContainer for \"bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a\"" Mar 17 17:49:27.131092 systemd[1]: Started cri-containerd-bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a.scope - libcontainer container bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a. Mar 17 17:49:27.157805 containerd[1483]: time="2025-03-17T17:49:27.157701061Z" level=info msg="StartContainer for \"bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a\" returns successfully" Mar 17 17:49:27.191174 systemd[1]: cri-containerd-bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a.scope: Deactivated successfully. Mar 17 17:49:27.220272 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a-rootfs.mount: Deactivated successfully. 
Mar 17 17:49:27.239243 containerd[1483]: time="2025-03-17T17:49:27.233599553Z" level=info msg="shim disconnected" id=bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a namespace=k8s.io Mar 17 17:49:27.239243 containerd[1483]: time="2025-03-17T17:49:27.239120132Z" level=warning msg="cleaning up after shim disconnected" id=bf731d6f63f3040f9c3b59765bd8d9ec697a1883d8924945f1e7bd876600701a namespace=k8s.io Mar 17 17:49:27.239243 containerd[1483]: time="2025-03-17T17:49:27.239134291Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:49:28.602835 containerd[1483]: time="2025-03-17T17:49:28.602784178Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:28.604202 containerd[1483]: time="2025-03-17T17:49:28.604021427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 17 17:49:28.610103 containerd[1483]: time="2025-03-17T17:49:28.610062717Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:28.612497 containerd[1483]: time="2025-03-17T17:49:28.612425538Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:28.613130 containerd[1483]: time="2025-03-17T17:49:28.613099082Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 1.538287853s" Mar 17 17:49:28.613181 containerd[1483]: 
time="2025-03-17T17:49:28.613129361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 17 17:49:28.614385 containerd[1483]: time="2025-03-17T17:49:28.614358890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 17 17:49:28.622645 containerd[1483]: time="2025-03-17T17:49:28.622603125Z" level=info msg="CreateContainer within sandbox \"d88e27a8f5f5d836528fb868013c1e010cc77824ff9ced745a16d346f84ff8d1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 17:49:28.637559 containerd[1483]: time="2025-03-17T17:49:28.637502275Z" level=info msg="CreateContainer within sandbox \"d88e27a8f5f5d836528fb868013c1e010cc77824ff9ced745a16d346f84ff8d1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"18a97471752a96a1294ac8b58929897ea7ef130a3894f137d2c171e5d6471e82\"" Mar 17 17:49:28.638098 containerd[1483]: time="2025-03-17T17:49:28.638064741Z" level=info msg="StartContainer for \"18a97471752a96a1294ac8b58929897ea7ef130a3894f137d2c171e5d6471e82\"" Mar 17 17:49:28.664110 systemd[1]: Started cri-containerd-18a97471752a96a1294ac8b58929897ea7ef130a3894f137d2c171e5d6471e82.scope - libcontainer container 18a97471752a96a1294ac8b58929897ea7ef130a3894f137d2c171e5d6471e82. 
Mar 17 17:49:28.696920 containerd[1483]: time="2025-03-17T17:49:28.696850959Z" level=info msg="StartContainer for \"18a97471752a96a1294ac8b58929897ea7ef130a3894f137d2c171e5d6471e82\" returns successfully" Mar 17 17:49:28.738714 kubelet[2562]: E0317 17:49:28.738632 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:28.807299 kubelet[2562]: I0317 17:49:28.807243 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7f5d99b795-7gcc6" podStartSLOduration=1.232657111 podStartE2EDuration="3.807224935s" podCreationTimestamp="2025-03-17 17:49:25 +0000 UTC" firstStartedPulling="2025-03-17 17:49:26.039353397 +0000 UTC m=+14.384946120" lastFinishedPulling="2025-03-17 17:49:28.613921181 +0000 UTC m=+16.959513944" observedRunningTime="2025-03-17 17:49:28.806156642 +0000 UTC m=+17.151749405" watchObservedRunningTime="2025-03-17 17:49:28.807224935 +0000 UTC m=+17.152817698" Mar 17 17:49:29.807196 kubelet[2562]: I0317 17:49:29.807142 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:49:30.738123 kubelet[2562]: E0317 17:49:30.738073 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:32.290315 containerd[1483]: time="2025-03-17T17:49:32.290258766Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:32.290936 containerd[1483]: 
time="2025-03-17T17:49:32.290879433Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 17 17:49:32.292321 containerd[1483]: time="2025-03-17T17:49:32.292288682Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:32.294515 containerd[1483]: time="2025-03-17T17:49:32.294447394Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:32.295316 containerd[1483]: time="2025-03-17T17:49:32.295277416Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 3.680884207s" Mar 17 17:49:32.295490 containerd[1483]: time="2025-03-17T17:49:32.295398054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 17 17:49:32.297738 containerd[1483]: time="2025-03-17T17:49:32.297702523Z" level=info msg="CreateContainer within sandbox \"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 17:49:32.310111 containerd[1483]: time="2025-03-17T17:49:32.309982414Z" level=info msg="CreateContainer within sandbox \"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50\"" Mar 17 17:49:32.315307 
containerd[1483]: time="2025-03-17T17:49:32.315236299Z" level=info msg="StartContainer for \"7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50\"" Mar 17 17:49:32.342191 systemd[1]: Started cri-containerd-7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50.scope - libcontainer container 7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50. Mar 17 17:49:32.370856 containerd[1483]: time="2025-03-17T17:49:32.370796403Z" level=info msg="StartContainer for \"7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50\" returns successfully" Mar 17 17:49:32.738262 kubelet[2562]: E0317 17:49:32.738108 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:33.164216 systemd[1]: cri-containerd-7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50.scope: Deactivated successfully. Mar 17 17:49:33.168463 systemd[1]: cri-containerd-7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50.scope: Consumed 444ms CPU time, 156.9M memory peak, 4K read from disk, 150.3M written to disk. Mar 17 17:49:33.182394 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50-rootfs.mount: Deactivated successfully. 
Mar 17 17:49:33.218032 containerd[1483]: time="2025-03-17T17:49:33.217891563Z" level=info msg="shim disconnected" id=7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50 namespace=k8s.io Mar 17 17:49:33.218032 containerd[1483]: time="2025-03-17T17:49:33.217968402Z" level=warning msg="cleaning up after shim disconnected" id=7782fef5035fa7b246588e09fea9b21f96dff91d81b3241a8c321ddc654cdd50 namespace=k8s.io Mar 17 17:49:33.218032 containerd[1483]: time="2025-03-17T17:49:33.217975962Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 17 17:49:33.263386 kubelet[2562]: I0317 17:49:33.263356 2562 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 17 17:49:33.300316 systemd[1]: Created slice kubepods-burstable-podeddc624d_899a_46cb_92b1_5f442cda8c19.slice - libcontainer container kubepods-burstable-podeddc624d_899a_46cb_92b1_5f442cda8c19.slice. Mar 17 17:49:33.307518 systemd[1]: Created slice kubepods-besteffort-podb4b132a3_c152_4a3d_ac76_fe05e87a1881.slice - libcontainer container kubepods-besteffort-podb4b132a3_c152_4a3d_ac76_fe05e87a1881.slice. Mar 17 17:49:33.317235 systemd[1]: Created slice kubepods-burstable-pod458cde0b_a7e6_4774_81d4_b10558db7a0b.slice - libcontainer container kubepods-burstable-pod458cde0b_a7e6_4774_81d4_b10558db7a0b.slice. Mar 17 17:49:33.322495 systemd[1]: Created slice kubepods-besteffort-pod67d8c30c_b1cb_4e16_9586_f6358511bb7c.slice - libcontainer container kubepods-besteffort-pod67d8c30c_b1cb_4e16_9586_f6358511bb7c.slice. Mar 17 17:49:33.334739 systemd[1]: Created slice kubepods-besteffort-podcad2d73e_b1c8_4d7b_b3f2_125742f00ac4.slice - libcontainer container kubepods-besteffort-podcad2d73e_b1c8_4d7b_b3f2_125742f00ac4.slice. 
Mar 17 17:49:33.388882 kubelet[2562]: I0317 17:49:33.388613 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grwkc\" (UniqueName: \"kubernetes.io/projected/67d8c30c-b1cb-4e16-9586-f6358511bb7c-kube-api-access-grwkc\") pod \"calico-apiserver-5d54787977-q7h42\" (UID: \"67d8c30c-b1cb-4e16-9586-f6358511bb7c\") " pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:33.388882 kubelet[2562]: I0317 17:49:33.388665 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eddc624d-899a-46cb-92b1-5f442cda8c19-config-volume\") pod \"coredns-6f6b679f8f-594fp\" (UID: \"eddc624d-899a-46cb-92b1-5f442cda8c19\") " pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:33.388882 kubelet[2562]: I0317 17:49:33.388683 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smdzj\" (UniqueName: \"kubernetes.io/projected/eddc624d-899a-46cb-92b1-5f442cda8c19-kube-api-access-smdzj\") pod \"coredns-6f6b679f8f-594fp\" (UID: \"eddc624d-899a-46cb-92b1-5f442cda8c19\") " pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:33.388882 kubelet[2562]: I0317 17:49:33.388701 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/67d8c30c-b1cb-4e16-9586-f6358511bb7c-calico-apiserver-certs\") pod \"calico-apiserver-5d54787977-q7h42\" (UID: \"67d8c30c-b1cb-4e16-9586-f6358511bb7c\") " pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:33.388882 kubelet[2562]: I0317 17:49:33.388720 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/458cde0b-a7e6-4774-81d4-b10558db7a0b-config-volume\") pod \"coredns-6f6b679f8f-xxmwz\" (UID: 
\"458cde0b-a7e6-4774-81d4-b10558db7a0b\") " pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:33.389131 kubelet[2562]: I0317 17:49:33.388735 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlgj6\" (UniqueName: \"kubernetes.io/projected/458cde0b-a7e6-4774-81d4-b10558db7a0b-kube-api-access-zlgj6\") pod \"coredns-6f6b679f8f-xxmwz\" (UID: \"458cde0b-a7e6-4774-81d4-b10558db7a0b\") " pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:33.389131 kubelet[2562]: I0317 17:49:33.388752 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cad2d73e-b1c8-4d7b-b3f2-125742f00ac4-calico-apiserver-certs\") pod \"calico-apiserver-5d54787977-bh55d\" (UID: \"cad2d73e-b1c8-4d7b-b3f2-125742f00ac4\") " pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:33.389131 kubelet[2562]: I0317 17:49:33.388767 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4b132a3-c152-4a3d-ac76-fe05e87a1881-tigera-ca-bundle\") pod \"calico-kube-controllers-65cf745c96-whxkh\" (UID: \"b4b132a3-c152-4a3d-ac76-fe05e87a1881\") " pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:33.389131 kubelet[2562]: I0317 17:49:33.388786 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj94v\" (UniqueName: \"kubernetes.io/projected/cad2d73e-b1c8-4d7b-b3f2-125742f00ac4-kube-api-access-fj94v\") pod \"calico-apiserver-5d54787977-bh55d\" (UID: \"cad2d73e-b1c8-4d7b-b3f2-125742f00ac4\") " pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:33.389131 kubelet[2562]: I0317 17:49:33.388803 2562 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfvwq\" 
(UniqueName: \"kubernetes.io/projected/b4b132a3-c152-4a3d-ac76-fe05e87a1881-kube-api-access-rfvwq\") pod \"calico-kube-controllers-65cf745c96-whxkh\" (UID: \"b4b132a3-c152-4a3d-ac76-fe05e87a1881\") " pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:33.605728 containerd[1483]: time="2025-03-17T17:49:33.605679018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:0,}" Mar 17 17:49:33.611774 containerd[1483]: time="2025-03-17T17:49:33.611739329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:0,}" Mar 17 17:49:33.620848 containerd[1483]: time="2025-03-17T17:49:33.620610261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:0,}" Mar 17 17:49:33.630134 containerd[1483]: time="2025-03-17T17:49:33.630084220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:49:33.643214 containerd[1483]: time="2025-03-17T17:49:33.643175062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:0,}" Mar 17 17:49:33.826303 containerd[1483]: time="2025-03-17T17:49:33.826162821Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 17 17:49:33.925880 kubelet[2562]: I0317 17:49:33.925321 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:49:34.007690 containerd[1483]: time="2025-03-17T17:49:34.007640256Z" level=error msg="Failed to destroy network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.011785 containerd[1483]: time="2025-03-17T17:49:34.010977427Z" level=error msg="Failed to destroy network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.011785 containerd[1483]: time="2025-03-17T17:49:34.011312980Z" level=error msg="encountered an error cleaning up failed sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.011785 containerd[1483]: time="2025-03-17T17:49:34.011372979Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.014501 kubelet[2562]: E0317 17:49:34.014245 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 17 17:49:34.014501 kubelet[2562]: E0317 17:49:34.014343 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:34.014501 kubelet[2562]: E0317 17:49:34.014363 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:34.014920 kubelet[2562]: E0317 17:49:34.014423 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" podUID="67d8c30c-b1cb-4e16-9586-f6358511bb7c" Mar 17 17:49:34.021548 containerd[1483]: time="2025-03-17T17:49:34.020296876Z" level=error msg="encountered an error 
cleaning up failed sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.021647 containerd[1483]: time="2025-03-17T17:49:34.021622249Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.022521 kubelet[2562]: E0317 17:49:34.022475 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.022894 kubelet[2562]: E0317 17:49:34.022860 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:34.022957 kubelet[2562]: E0317 17:49:34.022896 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:34.022992 kubelet[2562]: E0317 17:49:34.022966 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" podUID="b4b132a3-c152-4a3d-ac76-fe05e87a1881" Mar 17 17:49:34.028164 containerd[1483]: time="2025-03-17T17:49:34.028102715Z" level=error msg="Failed to destroy network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.028963 containerd[1483]: time="2025-03-17T17:49:34.028767622Z" level=error msg="encountered an error cleaning up failed sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Mar 17 17:49:34.028963 containerd[1483]: time="2025-03-17T17:49:34.028829740Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.029840 kubelet[2562]: E0317 17:49:34.029787 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.029901 kubelet[2562]: E0317 17:49:34.029857 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:34.029901 kubelet[2562]: E0317 17:49:34.029877 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:34.029986 kubelet[2562]: E0317 17:49:34.029946 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-594fp" podUID="eddc624d-899a-46cb-92b1-5f442cda8c19" Mar 17 17:49:34.046558 containerd[1483]: time="2025-03-17T17:49:34.046472578Z" level=error msg="Failed to destroy network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.047769 containerd[1483]: time="2025-03-17T17:49:34.047584875Z" level=error msg="encountered an error cleaning up failed sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.047769 containerd[1483]: time="2025-03-17T17:49:34.047654714Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox 
\"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.048400 kubelet[2562]: E0317 17:49:34.048051 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.048400 kubelet[2562]: E0317 17:49:34.048235 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:34.048400 kubelet[2562]: E0317 17:49:34.048257 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:34.050112 kubelet[2562]: E0317 17:49:34.048380 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" podUID="cad2d73e-b1c8-4d7b-b3f2-125742f00ac4" Mar 17 17:49:34.065410 containerd[1483]: time="2025-03-17T17:49:34.065340670Z" level=error msg="Failed to destroy network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.065892 containerd[1483]: time="2025-03-17T17:49:34.065675663Z" level=error msg="encountered an error cleaning up failed sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.065892 containerd[1483]: time="2025-03-17T17:49:34.065744382Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.066143 kubelet[2562]: E0317 17:49:34.065994 2562 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.066143 kubelet[2562]: E0317 17:49:34.066060 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:34.066143 kubelet[2562]: E0317 17:49:34.066081 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:34.066236 kubelet[2562]: E0317 17:49:34.066120 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xxmwz" podUID="458cde0b-a7e6-4774-81d4-b10558db7a0b" Mar 17 17:49:34.497238 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0-shm.mount: Deactivated successfully. Mar 17 17:49:34.743580 systemd[1]: Created slice kubepods-besteffort-pod1bd3d1fb_9cd7_48fa_86bf_ebd468d40871.slice - libcontainer container kubepods-besteffort-pod1bd3d1fb_9cd7_48fa_86bf_ebd468d40871.slice. Mar 17 17:49:34.745876 containerd[1483]: time="2025-03-17T17:49:34.745561693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:0,}" Mar 17 17:49:34.804784 containerd[1483]: time="2025-03-17T17:49:34.804666958Z" level=error msg="Failed to destroy network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.805041 containerd[1483]: time="2025-03-17T17:49:34.805015871Z" level=error msg="encountered an error cleaning up failed sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.805093 containerd[1483]: time="2025-03-17T17:49:34.805074910Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.806410 kubelet[2562]: E0317 17:49:34.806370 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.806483 kubelet[2562]: E0317 17:49:34.806443 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:34.806483 kubelet[2562]: E0317 17:49:34.806463 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:34.806543 kubelet[2562]: E0317 17:49:34.806509 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:34.806622 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb-shm.mount: Deactivated successfully. Mar 17 17:49:34.822428 kubelet[2562]: I0317 17:49:34.822397 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462" Mar 17 17:49:34.823284 containerd[1483]: time="2025-03-17T17:49:34.823249096Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\"" Mar 17 17:49:34.825303 kubelet[2562]: I0317 17:49:34.825280 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0" Mar 17 17:49:34.826080 containerd[1483]: time="2025-03-17T17:49:34.826011079Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\"" Mar 17 17:49:34.826702 containerd[1483]: time="2025-03-17T17:49:34.826604267Z" level=info msg="Ensure that sandbox 2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0 in task-service has been cleanup successfully" Mar 17 17:49:34.826791 containerd[1483]: time="2025-03-17T17:49:34.826726905Z" level=info msg="Ensure that sandbox bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462 in task-service has been cleanup successfully" Mar 17 17:49:34.827022 kubelet[2562]: I0317 17:49:34.826999 2562 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb" Mar 17 17:49:34.827334 containerd[1483]: time="2025-03-17T17:49:34.827304133Z" level=info msg="TearDown network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" successfully" Mar 17 17:49:34.827547 containerd[1483]: time="2025-03-17T17:49:34.827402211Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" returns successfully" Mar 17 17:49:34.827547 containerd[1483]: time="2025-03-17T17:49:34.827301533Z" level=info msg="TearDown network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" successfully" Mar 17 17:49:34.827547 containerd[1483]: time="2025-03-17T17:49:34.827509969Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" returns successfully" Mar 17 17:49:34.828755 containerd[1483]: time="2025-03-17T17:49:34.828139516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:1,}" Mar 17 17:49:34.828755 containerd[1483]: time="2025-03-17T17:49:34.828214114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:1,}" Mar 17 17:49:34.828755 containerd[1483]: time="2025-03-17T17:49:34.828400990Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\"" Mar 17 17:49:34.828755 containerd[1483]: time="2025-03-17T17:49:34.828611586Z" level=info msg="Ensure that sandbox 2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb in task-service has been cleanup successfully" Mar 17 17:49:34.828655 systemd[1]: run-netns-cni\x2d30a17eb3\x2d3090\x2d79ce\x2d9127\x2d0f7cc35ef2de.mount: Deactivated successfully. 
Mar 17 17:49:34.828780 systemd[1]: run-netns-cni\x2d74e2813d\x2dd5ed\x2d656f\x2d9e82\x2de83b102e8783.mount: Deactivated successfully. Mar 17 17:49:34.829588 containerd[1483]: time="2025-03-17T17:49:34.829404130Z" level=info msg="TearDown network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" successfully" Mar 17 17:49:34.829588 containerd[1483]: time="2025-03-17T17:49:34.829530287Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" returns successfully" Mar 17 17:49:34.830291 containerd[1483]: time="2025-03-17T17:49:34.830251992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:1,}" Mar 17 17:49:34.831775 kubelet[2562]: I0317 17:49:34.831749 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60" Mar 17 17:49:34.832761 containerd[1483]: time="2025-03-17T17:49:34.832659983Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\"" Mar 17 17:49:34.833047 containerd[1483]: time="2025-03-17T17:49:34.833016735Z" level=info msg="Ensure that sandbox 4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60 in task-service has been cleanup successfully" Mar 17 17:49:34.833704 containerd[1483]: time="2025-03-17T17:49:34.833424687Z" level=info msg="TearDown network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" successfully" Mar 17 17:49:34.833704 containerd[1483]: time="2025-03-17T17:49:34.833490486Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" returns successfully" Mar 17 17:49:34.834255 containerd[1483]: time="2025-03-17T17:49:34.834219511Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:49:34.834767 kubelet[2562]: I0317 17:49:34.834621 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec" Mar 17 17:49:34.835528 containerd[1483]: time="2025-03-17T17:49:34.835270249Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\"" Mar 17 17:49:34.835528 containerd[1483]: time="2025-03-17T17:49:34.835448765Z" level=info msg="Ensure that sandbox a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec in task-service has been cleanup successfully" Mar 17 17:49:34.836273 containerd[1483]: time="2025-03-17T17:49:34.836249509Z" level=info msg="TearDown network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" successfully" Mar 17 17:49:34.836273 containerd[1483]: time="2025-03-17T17:49:34.836272429Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" returns successfully" Mar 17 17:49:34.836612 kubelet[2562]: I0317 17:49:34.836588 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c" Mar 17 17:49:34.837632 containerd[1483]: time="2025-03-17T17:49:34.837599281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:1,}" Mar 17 17:49:34.847154 containerd[1483]: time="2025-03-17T17:49:34.847103046Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\"" Mar 17 17:49:34.847301 containerd[1483]: time="2025-03-17T17:49:34.847279162Z" level=info msg="Ensure that sandbox 
ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c in task-service has been cleanup successfully" Mar 17 17:49:34.847480 containerd[1483]: time="2025-03-17T17:49:34.847440999Z" level=info msg="TearDown network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" successfully" Mar 17 17:49:34.847480 containerd[1483]: time="2025-03-17T17:49:34.847458799Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" returns successfully" Mar 17 17:49:34.848296 containerd[1483]: time="2025-03-17T17:49:34.848265302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:1,}" Mar 17 17:49:34.982401 containerd[1483]: time="2025-03-17T17:49:34.982356667Z" level=error msg="Failed to destroy network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.984180 containerd[1483]: time="2025-03-17T17:49:34.984065592Z" level=error msg="Failed to destroy network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.985284 containerd[1483]: time="2025-03-17T17:49:34.984977333Z" level=error msg="encountered an error cleaning up failed sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 
17:49:34.985284 containerd[1483]: time="2025-03-17T17:49:34.985150649Z" level=error msg="encountered an error cleaning up failed sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.985381 containerd[1483]: time="2025-03-17T17:49:34.985291406Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.985534 kubelet[2562]: E0317 17:49:34.985495 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.985785 kubelet[2562]: E0317 17:49:34.985555 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:34.985785 kubelet[2562]: E0317 
17:49:34.985575 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:34.985785 kubelet[2562]: E0317 17:49:34.985620 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" podUID="b4b132a3-c152-4a3d-ac76-fe05e87a1881" Mar 17 17:49:34.985888 containerd[1483]: time="2025-03-17T17:49:34.985497122Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.987986 kubelet[2562]: E0317 17:49:34.985891 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:34.987986 kubelet[2562]: E0317 17:49:34.986100 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:34.987986 kubelet[2562]: E0317 17:49:34.986119 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:34.988126 kubelet[2562]: E0317 17:49:34.986154 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-6f6b679f8f-594fp" podUID="eddc624d-899a-46cb-92b1-5f442cda8c19" Mar 17 17:49:35.007232 containerd[1483]: time="2025-03-17T17:49:35.005041283Z" level=error msg="Failed to destroy network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.007232 containerd[1483]: time="2025-03-17T17:49:35.007005284Z" level=error msg="encountered an error cleaning up failed sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.007232 containerd[1483]: time="2025-03-17T17:49:35.007070003Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.007550 kubelet[2562]: E0317 17:49:35.007283 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.007550 kubelet[2562]: E0317 17:49:35.007345 2562 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:35.007550 kubelet[2562]: E0317 17:49:35.007362 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:35.008323 kubelet[2562]: E0317 17:49:35.007402 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:35.014743 containerd[1483]: time="2025-03-17T17:49:35.014354858Z" level=error msg="Failed to destroy network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.020626 containerd[1483]: time="2025-03-17T17:49:35.020579454Z" level=error msg="encountered an error cleaning up failed sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.024450 containerd[1483]: time="2025-03-17T17:49:35.021990946Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.026241 kubelet[2562]: E0317 17:49:35.022212 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.026241 kubelet[2562]: E0317 17:49:35.022294 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:35.026241 kubelet[2562]: E0317 17:49:35.022313 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:35.026357 kubelet[2562]: E0317 17:49:35.022356 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" podUID="67d8c30c-b1cb-4e16-9586-f6358511bb7c" Mar 17 17:49:35.028937 containerd[1483]: time="2025-03-17T17:49:35.028658613Z" level=error msg="Failed to destroy network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.029069 containerd[1483]: time="2025-03-17T17:49:35.029030726Z" level=error msg="encountered an error cleaning up failed sandbox 
\"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.029140 containerd[1483]: time="2025-03-17T17:49:35.029112764Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.029390 kubelet[2562]: E0317 17:49:35.029338 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.029450 kubelet[2562]: E0317 17:49:35.029402 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:35.029450 kubelet[2562]: E0317 17:49:35.029424 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:35.029539 kubelet[2562]: E0317 17:49:35.029490 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" podUID="cad2d73e-b1c8-4d7b-b3f2-125742f00ac4" Mar 17 17:49:35.065197 containerd[1483]: time="2025-03-17T17:49:35.065083568Z" level=error msg="Failed to destroy network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.065461 containerd[1483]: time="2025-03-17T17:49:35.065414241Z" level=error msg="encountered an error cleaning up failed sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 
17:49:35.065505 containerd[1483]: time="2025-03-17T17:49:35.065485480Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.065727 kubelet[2562]: E0317 17:49:35.065694 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:35.066002 kubelet[2562]: E0317 17:49:35.065848 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:35.066002 kubelet[2562]: E0317 17:49:35.065895 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:35.066002 kubelet[2562]: 
E0317 17:49:35.065957 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xxmwz" podUID="458cde0b-a7e6-4774-81d4-b10558db7a0b" Mar 17 17:49:35.499559 systemd[1]: run-netns-cni\x2db4b33b34\x2d1a6f\x2dcf02\x2d5c98\x2dd06e4cd1275a.mount: Deactivated successfully. Mar 17 17:49:35.499637 systemd[1]: run-netns-cni\x2da02da953\x2da0de\x2d268f\x2d71c0\x2dfa82b5ceaf6d.mount: Deactivated successfully. Mar 17 17:49:35.499681 systemd[1]: run-netns-cni\x2d001d9124\x2dde39\x2d433f\x2d2055\x2de6bee467eaea.mount: Deactivated successfully. Mar 17 17:49:35.499722 systemd[1]: run-netns-cni\x2d9435233e\x2dc85a\x2d287c\x2ddc73\x2d620a9b758795.mount: Deactivated successfully. 
Mar 17 17:49:35.840922 kubelet[2562]: I0317 17:49:35.840610 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9" Mar 17 17:49:35.841699 containerd[1483]: time="2025-03-17T17:49:35.841664229Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\"" Mar 17 17:49:35.842000 containerd[1483]: time="2025-03-17T17:49:35.841833026Z" level=info msg="Ensure that sandbox b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9 in task-service has been cleanup successfully" Mar 17 17:49:35.845242 systemd[1]: run-netns-cni\x2db16cf454\x2d92d7\x2d8a28\x2d1a79\x2dda1963626225.mount: Deactivated successfully. Mar 17 17:49:35.845624 kubelet[2562]: I0317 17:49:35.845588 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a" Mar 17 17:49:35.846318 containerd[1483]: time="2025-03-17T17:49:35.846277217Z" level=info msg="TearDown network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" successfully" Mar 17 17:49:35.846318 containerd[1483]: time="2025-03-17T17:49:35.846305056Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" returns successfully" Mar 17 17:49:35.847041 containerd[1483]: time="2025-03-17T17:49:35.846944124Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\"" Mar 17 17:49:35.847041 containerd[1483]: time="2025-03-17T17:49:35.847032362Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\"" Mar 17 17:49:35.847133 containerd[1483]: time="2025-03-17T17:49:35.847105281Z" level=info msg="TearDown network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" successfully" Mar 17 17:49:35.847133 
containerd[1483]: time="2025-03-17T17:49:35.847114960Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" returns successfully" Mar 17 17:49:35.847175 containerd[1483]: time="2025-03-17T17:49:35.847134000Z" level=info msg="Ensure that sandbox 2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a in task-service has been cleanup successfully" Mar 17 17:49:35.847780 kubelet[2562]: I0317 17:49:35.847738 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc" Mar 17 17:49:35.847840 containerd[1483]: time="2025-03-17T17:49:35.847770387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:2,}" Mar 17 17:49:35.848148 containerd[1483]: time="2025-03-17T17:49:35.848110981Z" level=info msg="TearDown network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" successfully" Mar 17 17:49:35.848148 containerd[1483]: time="2025-03-17T17:49:35.848142900Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" returns successfully" Mar 17 17:49:35.849404 containerd[1483]: time="2025-03-17T17:49:35.848284977Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\"" Mar 17 17:49:35.849404 containerd[1483]: time="2025-03-17T17:49:35.848463454Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\"" Mar 17 17:49:35.849404 containerd[1483]: time="2025-03-17T17:49:35.848480293Z" level=info msg="Ensure that sandbox 50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc in task-service has been cleanup successfully" Mar 17 17:49:35.849404 containerd[1483]: time="2025-03-17T17:49:35.848542452Z" level=info msg="TearDown network for sandbox 
\"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" successfully" Mar 17 17:49:35.849404 containerd[1483]: time="2025-03-17T17:49:35.848552332Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" returns successfully" Mar 17 17:49:35.849404 containerd[1483]: time="2025-03-17T17:49:35.848966924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:49:35.849149 systemd[1]: run-netns-cni\x2dad0177b0\x2d7a1c\x2df950\x2d7ec6\x2d81041ee5d688.mount: Deactivated successfully. Mar 17 17:49:35.849656 containerd[1483]: time="2025-03-17T17:49:35.849593671Z" level=info msg="TearDown network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" successfully" Mar 17 17:49:35.849656 containerd[1483]: time="2025-03-17T17:49:35.849613911Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" returns successfully" Mar 17 17:49:35.850185 containerd[1483]: time="2025-03-17T17:49:35.850044182Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\"" Mar 17 17:49:35.850250 containerd[1483]: time="2025-03-17T17:49:35.850234058Z" level=info msg="TearDown network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" successfully" Mar 17 17:49:35.850275 containerd[1483]: time="2025-03-17T17:49:35.850251618Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" returns successfully" Mar 17 17:49:35.850864 kubelet[2562]: I0317 17:49:35.850747 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b" Mar 17 17:49:35.851610 containerd[1483]: time="2025-03-17T17:49:35.851544632Z" level=info 
msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\"" Mar 17 17:49:35.851671 containerd[1483]: time="2025-03-17T17:49:35.851646110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:2,}" Mar 17 17:49:35.851836 containerd[1483]: time="2025-03-17T17:49:35.851688629Z" level=info msg="Ensure that sandbox 1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b in task-service has been cleanup successfully" Mar 17 17:49:35.852715 containerd[1483]: time="2025-03-17T17:49:35.851900585Z" level=info msg="TearDown network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" successfully" Mar 17 17:49:35.852715 containerd[1483]: time="2025-03-17T17:49:35.851941704Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" returns successfully" Mar 17 17:49:35.852426 systemd[1]: run-netns-cni\x2d513e5c5c\x2d5bcc\x2d224f\x2d6699\x2d15e99f7eca5a.mount: Deactivated successfully. 
Mar 17 17:49:35.853430 containerd[1483]: time="2025-03-17T17:49:35.853259038Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\"" Mar 17 17:49:35.853430 containerd[1483]: time="2025-03-17T17:49:35.853333397Z" level=info msg="TearDown network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" successfully" Mar 17 17:49:35.853430 containerd[1483]: time="2025-03-17T17:49:35.853342996Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" returns successfully" Mar 17 17:49:35.854459 containerd[1483]: time="2025-03-17T17:49:35.854382976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:2,}" Mar 17 17:49:35.855149 kubelet[2562]: I0317 17:49:35.855113 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615" Mar 17 17:49:35.855809 containerd[1483]: time="2025-03-17T17:49:35.855580112Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\"" Mar 17 17:49:35.855809 containerd[1483]: time="2025-03-17T17:49:35.855726589Z" level=info msg="Ensure that sandbox 226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615 in task-service has been cleanup successfully" Mar 17 17:49:35.855681 systemd[1]: run-netns-cni\x2df1b85187\x2d016c\x2d3c9d\x2d8acb\x2de3420e87990d.mount: Deactivated successfully. 
Mar 17 17:49:35.856441 containerd[1483]: time="2025-03-17T17:49:35.855981624Z" level=info msg="TearDown network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" successfully" Mar 17 17:49:35.856441 containerd[1483]: time="2025-03-17T17:49:35.855998624Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" returns successfully" Mar 17 17:49:35.862736 containerd[1483]: time="2025-03-17T17:49:35.862614252Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\"" Mar 17 17:49:35.862736 containerd[1483]: time="2025-03-17T17:49:35.862716930Z" level=info msg="TearDown network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" successfully" Mar 17 17:49:35.862736 containerd[1483]: time="2025-03-17T17:49:35.862729130Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" returns successfully" Mar 17 17:49:35.865250 containerd[1483]: time="2025-03-17T17:49:35.865219320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:2,}" Mar 17 17:49:35.865741 kubelet[2562]: I0317 17:49:35.865680 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e" Mar 17 17:49:35.866722 containerd[1483]: time="2025-03-17T17:49:35.866696531Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\"" Mar 17 17:49:35.866895 containerd[1483]: time="2025-03-17T17:49:35.866854367Z" level=info msg="Ensure that sandbox 0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e in task-service has been cleanup successfully" Mar 17 17:49:35.867565 containerd[1483]: time="2025-03-17T17:49:35.867542354Z" level=info msg="TearDown network for sandbox 
\"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" successfully" Mar 17 17:49:35.867565 containerd[1483]: time="2025-03-17T17:49:35.867563513Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" returns successfully" Mar 17 17:49:35.868012 containerd[1483]: time="2025-03-17T17:49:35.867935266Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\"" Mar 17 17:49:35.868054 containerd[1483]: time="2025-03-17T17:49:35.868015824Z" level=info msg="TearDown network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" successfully" Mar 17 17:49:35.868054 containerd[1483]: time="2025-03-17T17:49:35.868026704Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" returns successfully" Mar 17 17:49:35.868647 containerd[1483]: time="2025-03-17T17:49:35.868621292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:2,}" Mar 17 17:49:36.067662 containerd[1483]: time="2025-03-17T17:49:36.067602373Z" level=error msg="Failed to destroy network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.070255 containerd[1483]: time="2025-03-17T17:49:36.070203803Z" level=error msg="encountered an error cleaning up failed sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.070365 
containerd[1483]: time="2025-03-17T17:49:36.070280761Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.070647 kubelet[2562]: E0317 17:49:36.070537 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.070940 kubelet[2562]: E0317 17:49:36.070674 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:36.070940 kubelet[2562]: E0317 17:49:36.070697 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 
17:49:36.070940 kubelet[2562]: E0317 17:49:36.070744 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" podUID="67d8c30c-b1cb-4e16-9586-f6358511bb7c" Mar 17 17:49:36.080062 containerd[1483]: time="2025-03-17T17:49:36.080010013Z" level=error msg="Failed to destroy network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.080536 containerd[1483]: time="2025-03-17T17:49:36.080497644Z" level=error msg="encountered an error cleaning up failed sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.091130 containerd[1483]: time="2025-03-17T17:49:36.091020001Z" level=error msg="Failed to destroy network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.091577 containerd[1483]: time="2025-03-17T17:49:36.091525631Z" level=error msg="encountered an error cleaning up failed sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.092446 containerd[1483]: time="2025-03-17T17:49:36.092310776Z" level=error msg="Failed to destroy network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.093575 containerd[1483]: time="2025-03-17T17:49:36.093450434Z" level=error msg="encountered an error cleaning up failed sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.096585 containerd[1483]: time="2025-03-17T17:49:36.096360658Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.096585 containerd[1483]: time="2025-03-17T17:49:36.096476096Z" level=error msg="RunPodSandbox 
for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.096585 containerd[1483]: time="2025-03-17T17:49:36.096516895Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.096791 kubelet[2562]: E0317 17:49:36.096751 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.096848 kubelet[2562]: E0317 17:49:36.096795 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.096848 kubelet[2562]: E0317 17:49:36.096817 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for 
pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:36.096848 kubelet[2562]: E0317 17:49:36.096834 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:36.096931 kubelet[2562]: E0317 17:49:36.096853 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:36.096931 kubelet[2562]: E0317 17:49:36.096835 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:36.096982 kubelet[2562]: E0317 17:49:36.096960 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" podUID="cad2d73e-b1c8-4d7b-b3f2-125742f00ac4" Mar 17 17:49:36.097250 kubelet[2562]: E0317 17:49:36.097207 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xxmwz" podUID="458cde0b-a7e6-4774-81d4-b10558db7a0b" Mar 17 17:49:36.097322 kubelet[2562]: E0317 17:49:36.096753 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.097322 kubelet[2562]: E0317 
17:49:36.097287 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:36.097371 kubelet[2562]: E0317 17:49:36.097302 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:36.097394 kubelet[2562]: E0317 17:49:36.097364 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-594fp" podUID="eddc624d-899a-46cb-92b1-5f442cda8c19" Mar 17 17:49:36.100074 containerd[1483]: time="2025-03-17T17:49:36.099947149Z" level=error msg="Failed to destroy network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.100366 containerd[1483]: time="2025-03-17T17:49:36.100339661Z" level=error msg="encountered an error cleaning up failed sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.100600 containerd[1483]: time="2025-03-17T17:49:36.100517018Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.100879 kubelet[2562]: E0317 17:49:36.100838 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.100949 kubelet[2562]: E0317 17:49:36.100920 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:36.100949 kubelet[2562]: E0317 17:49:36.100941 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:36.101009 kubelet[2562]: E0317 17:49:36.100983 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:36.108954 containerd[1483]: time="2025-03-17T17:49:36.108816658Z" level=error msg="Failed to destroy network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.109455 containerd[1483]: time="2025-03-17T17:49:36.109395087Z" level=error msg="encountered an error cleaning up failed sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\", marking sandbox state as SANDBOX_UNKNOWN" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.109719 containerd[1483]: time="2025-03-17T17:49:36.109619922Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.109868 kubelet[2562]: E0317 17:49:36.109822 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:36.109927 kubelet[2562]: E0317 17:49:36.109885 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:36.109927 kubelet[2562]: E0317 17:49:36.109903 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:36.109979 kubelet[2562]: E0317 17:49:36.109954 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" podUID="b4b132a3-c152-4a3d-ac76-fe05e87a1881" Mar 17 17:49:36.498297 systemd[1]: run-netns-cni\x2deb2aef88\x2db858\x2d805b\x2d53aa\x2dbebfd1c15ef4.mount: Deactivated successfully. Mar 17 17:49:36.498801 systemd[1]: run-netns-cni\x2d27aed242\x2d3137\x2d7b41\x2d0c6b\x2db33a2f975ffb.mount: Deactivated successfully. 
Mar 17 17:49:36.869172 kubelet[2562]: I0317 17:49:36.869077 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5" Mar 17 17:49:36.869870 containerd[1483]: time="2025-03-17T17:49:36.869831382Z" level=info msg="StopPodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\"" Mar 17 17:49:36.870189 containerd[1483]: time="2025-03-17T17:49:36.870051458Z" level=info msg="Ensure that sandbox 2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5 in task-service has been cleanup successfully" Mar 17 17:49:36.871929 containerd[1483]: time="2025-03-17T17:49:36.870816043Z" level=info msg="TearDown network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" successfully" Mar 17 17:49:36.871929 containerd[1483]: time="2025-03-17T17:49:36.870839283Z" level=info msg="StopPodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" returns successfully" Mar 17 17:49:36.872374 systemd[1]: run-netns-cni\x2dfe18afcd\x2d3567\x2d05aa\x2dae9f\x2debbaf8a7625e.mount: Deactivated successfully. 
Mar 17 17:49:36.875343 kubelet[2562]: I0317 17:49:36.874515 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6" Mar 17 17:49:36.875437 containerd[1483]: time="2025-03-17T17:49:36.874975243Z" level=info msg="StopPodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\"" Mar 17 17:49:36.875437 containerd[1483]: time="2025-03-17T17:49:36.875142280Z" level=info msg="Ensure that sandbox 39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6 in task-service has been cleanup successfully" Mar 17 17:49:36.875437 containerd[1483]: time="2025-03-17T17:49:36.875325636Z" level=info msg="TearDown network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" successfully" Mar 17 17:49:36.875437 containerd[1483]: time="2025-03-17T17:49:36.875339036Z" level=info msg="StopPodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" returns successfully" Mar 17 17:49:36.876719 containerd[1483]: time="2025-03-17T17:49:36.876688930Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\"" Mar 17 17:49:36.876988 containerd[1483]: time="2025-03-17T17:49:36.876967764Z" level=info msg="TearDown network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" successfully" Mar 17 17:49:36.877063 containerd[1483]: time="2025-03-17T17:49:36.877049323Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" returns successfully" Mar 17 17:49:36.877805 containerd[1483]: time="2025-03-17T17:49:36.877777429Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\"" Mar 17 17:49:36.877985 containerd[1483]: time="2025-03-17T17:49:36.877968745Z" level=info msg="TearDown network for sandbox 
\"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" successfully" Mar 17 17:49:36.878048 kubelet[2562]: I0317 17:49:36.877973 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0" Mar 17 17:49:36.878124 containerd[1483]: time="2025-03-17T17:49:36.878107303Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" returns successfully" Mar 17 17:49:36.878471 systemd[1]: run-netns-cni\x2dfa1ab9eb\x2dfe34\x2d9deb\x2d11c3\x2d2a067059f93f.mount: Deactivated successfully. Mar 17 17:49:36.879826 containerd[1483]: time="2025-03-17T17:49:36.879443077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:49:36.879826 containerd[1483]: time="2025-03-17T17:49:36.879568874Z" level=info msg="StopPodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\"" Mar 17 17:49:36.879826 containerd[1483]: time="2025-03-17T17:49:36.879703872Z" level=info msg="Ensure that sandbox e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0 in task-service has been cleanup successfully" Mar 17 17:49:36.880134 containerd[1483]: time="2025-03-17T17:49:36.880113944Z" level=info msg="TearDown network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" successfully" Mar 17 17:49:36.880280 containerd[1483]: time="2025-03-17T17:49:36.880253061Z" level=info msg="StopPodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" returns successfully" Mar 17 17:49:36.881620 systemd[1]: run-netns-cni\x2d73ea677f\x2d6cf4\x2d6184\x2d333c\x2dad457be1bd69.mount: Deactivated successfully. 
Mar 17 17:49:36.883068 containerd[1483]: time="2025-03-17T17:49:36.882766653Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\"" Mar 17 17:49:36.883068 containerd[1483]: time="2025-03-17T17:49:36.882864131Z" level=info msg="TearDown network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" successfully" Mar 17 17:49:36.883068 containerd[1483]: time="2025-03-17T17:49:36.882875331Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" returns successfully" Mar 17 17:49:36.883369 containerd[1483]: time="2025-03-17T17:49:36.883347201Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\"" Mar 17 17:49:36.883554 containerd[1483]: time="2025-03-17T17:49:36.883508758Z" level=info msg="TearDown network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" successfully" Mar 17 17:49:36.883718 containerd[1483]: time="2025-03-17T17:49:36.883699435Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" returns successfully" Mar 17 17:49:36.884356 containerd[1483]: time="2025-03-17T17:49:36.884322023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:3,}" Mar 17 17:49:36.884499 kubelet[2562]: I0317 17:49:36.884476 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246" Mar 17 17:49:36.887962 containerd[1483]: time="2025-03-17T17:49:36.887683198Z" level=info msg="StopPodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\"" Mar 17 17:49:36.887962 containerd[1483]: time="2025-03-17T17:49:36.887866834Z" level=info msg="Ensure that sandbox 
0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246 in task-service has been cleanup successfully" Mar 17 17:49:36.888631 containerd[1483]: time="2025-03-17T17:49:36.888390744Z" level=info msg="TearDown network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" successfully" Mar 17 17:49:36.888631 containerd[1483]: time="2025-03-17T17:49:36.888421984Z" level=info msg="StopPodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" returns successfully" Mar 17 17:49:36.889252 containerd[1483]: time="2025-03-17T17:49:36.888718698Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\"" Mar 17 17:49:36.889605 containerd[1483]: time="2025-03-17T17:49:36.889412124Z" level=info msg="TearDown network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" successfully" Mar 17 17:49:36.889605 containerd[1483]: time="2025-03-17T17:49:36.889438604Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" returns successfully" Mar 17 17:49:36.889605 containerd[1483]: time="2025-03-17T17:49:36.889467763Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\"" Mar 17 17:49:36.891055 containerd[1483]: time="2025-03-17T17:49:36.889700599Z" level=info msg="TearDown network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" successfully" Mar 17 17:49:36.891055 containerd[1483]: time="2025-03-17T17:49:36.889739598Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" returns successfully" Mar 17 17:49:36.891055 containerd[1483]: time="2025-03-17T17:49:36.890190349Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\"" Mar 17 17:49:36.891055 containerd[1483]: time="2025-03-17T17:49:36.890283188Z" level=info 
msg="TearDown network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" successfully" Mar 17 17:49:36.891055 containerd[1483]: time="2025-03-17T17:49:36.890308827Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" returns successfully" Mar 17 17:49:36.890579 systemd[1]: run-netns-cni\x2df31db59d\x2dac0c\x2d9cff\x2d21f4\x2dc129b3be9dda.mount: Deactivated successfully. Mar 17 17:49:36.891268 containerd[1483]: time="2025-03-17T17:49:36.891155251Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\"" Mar 17 17:49:36.891268 containerd[1483]: time="2025-03-17T17:49:36.891236929Z" level=info msg="TearDown network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" successfully" Mar 17 17:49:36.891268 containerd[1483]: time="2025-03-17T17:49:36.891247289Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" returns successfully" Mar 17 17:49:36.892397 containerd[1483]: time="2025-03-17T17:49:36.892344068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:3,}" Mar 17 17:49:36.894160 kubelet[2562]: I0317 17:49:36.894103 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a" Mar 17 17:49:36.895579 containerd[1483]: time="2025-03-17T17:49:36.895443768Z" level=info msg="StopPodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\"" Mar 17 17:49:36.895642 containerd[1483]: time="2025-03-17T17:49:36.895592405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:3,}" Mar 17 17:49:36.896216 
containerd[1483]: time="2025-03-17T17:49:36.895897359Z" level=info msg="Ensure that sandbox 6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a in task-service has been cleanup successfully" Mar 17 17:49:36.896980 containerd[1483]: time="2025-03-17T17:49:36.896809702Z" level=info msg="TearDown network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" successfully" Mar 17 17:49:36.897207 containerd[1483]: time="2025-03-17T17:49:36.897186855Z" level=info msg="StopPodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" returns successfully" Mar 17 17:49:36.897666 containerd[1483]: time="2025-03-17T17:49:36.897642966Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\"" Mar 17 17:49:36.897736 containerd[1483]: time="2025-03-17T17:49:36.897722844Z" level=info msg="TearDown network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" successfully" Mar 17 17:49:36.897758 containerd[1483]: time="2025-03-17T17:49:36.897736684Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" returns successfully" Mar 17 17:49:36.898014 kubelet[2562]: I0317 17:49:36.897992 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2" Mar 17 17:49:36.898051 containerd[1483]: time="2025-03-17T17:49:36.898017719Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\"" Mar 17 17:49:36.898107 containerd[1483]: time="2025-03-17T17:49:36.898090277Z" level=info msg="TearDown network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" successfully" Mar 17 17:49:36.898132 containerd[1483]: time="2025-03-17T17:49:36.898107797Z" level=info msg="StopPodSandbox for 
\"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" returns successfully" Mar 17 17:49:36.898455 containerd[1483]: time="2025-03-17T17:49:36.898434271Z" level=info msg="StopPodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\"" Mar 17 17:49:36.898584 containerd[1483]: time="2025-03-17T17:49:36.898568228Z" level=info msg="Ensure that sandbox 007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2 in task-service has been cleanup successfully" Mar 17 17:49:36.898780 containerd[1483]: time="2025-03-17T17:49:36.898761344Z" level=info msg="TearDown network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" successfully" Mar 17 17:49:36.898813 containerd[1483]: time="2025-03-17T17:49:36.898780424Z" level=info msg="StopPodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" returns successfully" Mar 17 17:49:36.899214 containerd[1483]: time="2025-03-17T17:49:36.899187376Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\"" Mar 17 17:49:36.899250 containerd[1483]: time="2025-03-17T17:49:36.899209496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:3,}" Mar 17 17:49:36.899345 containerd[1483]: time="2025-03-17T17:49:36.899326493Z" level=info msg="TearDown network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" successfully" Mar 17 17:49:36.899373 containerd[1483]: time="2025-03-17T17:49:36.899345093Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" returns successfully" Mar 17 17:49:36.899677 containerd[1483]: time="2025-03-17T17:49:36.899655407Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\"" Mar 17 17:49:36.899756 containerd[1483]: 
time="2025-03-17T17:49:36.899729366Z" level=info msg="TearDown network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" successfully" Mar 17 17:49:36.899756 containerd[1483]: time="2025-03-17T17:49:36.899739685Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" returns successfully" Mar 17 17:49:36.900236 containerd[1483]: time="2025-03-17T17:49:36.900211836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:3,}" Mar 17 17:49:37.226020 containerd[1483]: time="2025-03-17T17:49:37.225451899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:37.232065 containerd[1483]: time="2025-03-17T17:49:37.232012337Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 17 17:49:37.240678 containerd[1483]: time="2025-03-17T17:49:37.240622896Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:37.247807 containerd[1483]: time="2025-03-17T17:49:37.247759403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 17 17:49:37.250092 containerd[1483]: time="2025-03-17T17:49:37.249955122Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 3.423650064s" 
Mar 17 17:49:37.250199 containerd[1483]: time="2025-03-17T17:49:37.250094479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 17 17:49:37.267952 containerd[1483]: time="2025-03-17T17:49:37.267884227Z" level=info msg="CreateContainer within sandbox \"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 17:49:37.310668 containerd[1483]: time="2025-03-17T17:49:37.310610269Z" level=info msg="CreateContainer within sandbox \"d54d8d0d708cc4a9c1b2feeb9cba503bbc4375b12ca22cc71e6132b4b834f779\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1c88a8d099967c79094306c265402dee11729280559a2ecb12a7cd3199ad8eb4\"" Mar 17 17:49:37.311172 containerd[1483]: time="2025-03-17T17:49:37.311104419Z" level=info msg="StartContainer for \"1c88a8d099967c79094306c265402dee11729280559a2ecb12a7cd3199ad8eb4\"" Mar 17 17:49:37.326247 containerd[1483]: time="2025-03-17T17:49:37.326193697Z" level=error msg="Failed to destroy network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.326969 containerd[1483]: time="2025-03-17T17:49:37.326756487Z" level=error msg="encountered an error cleaning up failed sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.327106 containerd[1483]: time="2025-03-17T17:49:37.327006962Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.327449 kubelet[2562]: E0317 17:49:37.327348 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.327449 kubelet[2562]: E0317 17:49:37.327436 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:37.327755 kubelet[2562]: E0317 17:49:37.327462 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" Mar 17 17:49:37.327755 kubelet[2562]: E0317 17:49:37.327509 2562 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-bh55d_calico-apiserver(cad2d73e-b1c8-4d7b-b3f2-125742f00ac4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" podUID="cad2d73e-b1c8-4d7b-b3f2-125742f00ac4" Mar 17 17:49:37.345365 containerd[1483]: time="2025-03-17T17:49:37.345315900Z" level=error msg="Failed to destroy network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.346960 containerd[1483]: time="2025-03-17T17:49:37.346124445Z" level=error msg="encountered an error cleaning up failed sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.346960 containerd[1483]: time="2025-03-17T17:49:37.346293042Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.347078 kubelet[2562]: E0317 17:49:37.346623 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.347078 kubelet[2562]: E0317 17:49:37.346678 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:37.347078 kubelet[2562]: E0317 17:49:37.346696 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-xxmwz" Mar 17 17:49:37.347160 kubelet[2562]: E0317 17:49:37.346731 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-xxmwz_kube-system(458cde0b-a7e6-4774-81d4-b10558db7a0b)\\\": rpc error: code = Unknown desc = failed to setup network 
for sandbox \\\"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-xxmwz" podUID="458cde0b-a7e6-4774-81d4-b10558db7a0b" Mar 17 17:49:37.348497 containerd[1483]: time="2025-03-17T17:49:37.348218606Z" level=error msg="Failed to destroy network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.349229 containerd[1483]: time="2025-03-17T17:49:37.349197908Z" level=error msg="Failed to destroy network for sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.349389 containerd[1483]: time="2025-03-17T17:49:37.349235107Z" level=error msg="encountered an error cleaning up failed sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.349659 containerd[1483]: time="2025-03-17T17:49:37.349447863Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.349844 containerd[1483]: time="2025-03-17T17:49:37.349814696Z" level=error msg="encountered an error cleaning up failed sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.349970 containerd[1483]: time="2025-03-17T17:49:37.349947734Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.350099 kubelet[2562]: E0317 17:49:37.350064 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.350198 kubelet[2562]: E0317 17:49:37.350116 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:37.350253 kubelet[2562]: E0317 17:49:37.350185 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zkh2s" Mar 17 17:49:37.350253 kubelet[2562]: E0317 17:49:37.350237 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zkh2s_calico-system(1bd3d1fb-9cd7-48fa-86bf-ebd468d40871)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zkh2s" podUID="1bd3d1fb-9cd7-48fa-86bf-ebd468d40871" Mar 17 17:49:37.350536 kubelet[2562]: E0317 17:49:37.350510 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.350589 kubelet[2562]: E0317 17:49:37.350540 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:37.350589 kubelet[2562]: E0317 17:49:37.350555 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-594fp" Mar 17 17:49:37.350639 kubelet[2562]: E0317 17:49:37.350588 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-594fp_kube-system(eddc624d-899a-46cb-92b1-5f442cda8c19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-594fp" podUID="eddc624d-899a-46cb-92b1-5f442cda8c19" Mar 17 17:49:37.355538 containerd[1483]: time="2025-03-17T17:49:37.355493790Z" level=error msg="Failed to destroy network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.356500 
containerd[1483]: time="2025-03-17T17:49:37.356449012Z" level=error msg="encountered an error cleaning up failed sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.361272 containerd[1483]: time="2025-03-17T17:49:37.361225123Z" level=error msg="Failed to destroy network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.361578 containerd[1483]: time="2025-03-17T17:49:37.361540317Z" level=error msg="encountered an error cleaning up failed sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.371924 containerd[1483]: time="2025-03-17T17:49:37.371854444Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.372054 containerd[1483]: time="2025-03-17T17:49:37.371984922Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.372213 kubelet[2562]: E0317 17:49:37.372177 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.372265 kubelet[2562]: E0317 17:49:37.372242 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:37.372297 kubelet[2562]: E0317 17:49:37.372262 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" Mar 17 17:49:37.372329 kubelet[2562]: E0317 17:49:37.372297 2562 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d54787977-q7h42_calico-apiserver(67d8c30c-b1cb-4e16-9586-f6358511bb7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" podUID="67d8c30c-b1cb-4e16-9586-f6358511bb7c" Mar 17 17:49:37.372704 kubelet[2562]: E0317 17:49:37.372171 2562 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 17:49:37.372704 kubelet[2562]: E0317 17:49:37.372554 2562 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:37.372704 kubelet[2562]: E0317 17:49:37.372574 2562 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" Mar 17 17:49:37.372802 kubelet[2562]: E0317 17:49:37.372600 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-65cf745c96-whxkh_calico-system(b4b132a3-c152-4a3d-ac76-fe05e87a1881)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" podUID="b4b132a3-c152-4a3d-ac76-fe05e87a1881" Mar 17 17:49:37.399061 systemd[1]: Started cri-containerd-1c88a8d099967c79094306c265402dee11729280559a2ecb12a7cd3199ad8eb4.scope - libcontainer container 1c88a8d099967c79094306c265402dee11729280559a2ecb12a7cd3199ad8eb4. Mar 17 17:49:37.428127 containerd[1483]: time="2025-03-17T17:49:37.428035755Z" level=info msg="StartContainer for \"1c88a8d099967c79094306c265402dee11729280559a2ecb12a7cd3199ad8eb4\" returns successfully" Mar 17 17:49:37.499710 systemd[1]: run-netns-cni\x2d93777c4f\x2dd759\x2d4d3b\x2df167\x2d8ff59c3db96e.mount: Deactivated successfully. Mar 17 17:49:37.500006 systemd[1]: run-netns-cni\x2d0bd2191f\x2d673b\x2da4c4\x2dd72d\x2d0d79c3b270f7.mount: Deactivated successfully. Mar 17 17:49:37.500081 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1643029132.mount: Deactivated successfully. Mar 17 17:49:37.601826 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. 
Mar 17 17:49:37.601984 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 17:49:37.905079 kubelet[2562]: I0317 17:49:37.904954 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1" Mar 17 17:49:37.905673 containerd[1483]: time="2025-03-17T17:49:37.905639472Z" level=info msg="StopPodSandbox for \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\"" Mar 17 17:49:37.906974 containerd[1483]: time="2025-03-17T17:49:37.906860729Z" level=info msg="Ensure that sandbox d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1 in task-service has been cleanup successfully" Mar 17 17:49:37.908584 containerd[1483]: time="2025-03-17T17:49:37.908546538Z" level=info msg="TearDown network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\" successfully" Mar 17 17:49:37.908584 containerd[1483]: time="2025-03-17T17:49:37.908571698Z" level=info msg="StopPodSandbox for \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\" returns successfully" Mar 17 17:49:37.909041 systemd[1]: run-netns-cni\x2dc860b6cc\x2d9895\x2db3ce\x2ddac6\x2d3389bdc5e3f2.mount: Deactivated successfully. 
Mar 17 17:49:37.910313 containerd[1483]: time="2025-03-17T17:49:37.909537079Z" level=info msg="StopPodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\"" Mar 17 17:49:37.910313 containerd[1483]: time="2025-03-17T17:49:37.909639678Z" level=info msg="TearDown network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" successfully" Mar 17 17:49:37.910313 containerd[1483]: time="2025-03-17T17:49:37.909649317Z" level=info msg="StopPodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" returns successfully" Mar 17 17:49:37.912998 containerd[1483]: time="2025-03-17T17:49:37.910840135Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\"" Mar 17 17:49:37.912998 containerd[1483]: time="2025-03-17T17:49:37.910946293Z" level=info msg="TearDown network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" successfully" Mar 17 17:49:37.912998 containerd[1483]: time="2025-03-17T17:49:37.910957853Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" returns successfully" Mar 17 17:49:37.913375 containerd[1483]: time="2025-03-17T17:49:37.913328209Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\"" Mar 17 17:49:37.913475 containerd[1483]: time="2025-03-17T17:49:37.913429247Z" level=info msg="TearDown network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" successfully" Mar 17 17:49:37.913475 containerd[1483]: time="2025-03-17T17:49:37.913441687Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" returns successfully" Mar 17 17:49:37.914980 containerd[1483]: time="2025-03-17T17:49:37.914754262Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:4,}" Mar 17 17:49:37.916733 kubelet[2562]: I0317 17:49:37.916708 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d" Mar 17 17:49:37.919266 containerd[1483]: time="2025-03-17T17:49:37.918891505Z" level=info msg="StopPodSandbox for \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\"" Mar 17 17:49:37.919266 containerd[1483]: time="2025-03-17T17:49:37.919097781Z" level=info msg="Ensure that sandbox b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d in task-service has been cleanup successfully" Mar 17 17:49:37.921936 containerd[1483]: time="2025-03-17T17:49:37.919546852Z" level=info msg="TearDown network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\" successfully" Mar 17 17:49:37.921936 containerd[1483]: time="2025-03-17T17:49:37.920356717Z" level=info msg="StopPodSandbox for \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\" returns successfully" Mar 17 17:49:37.921286 systemd[1]: run-netns-cni\x2d0bfff297\x2dab3f\x2d249f\x2d937a\x2d86c77233a9ad.mount: Deactivated successfully. 
Mar 17 17:49:37.922801 containerd[1483]: time="2025-03-17T17:49:37.922754153Z" level=info msg="StopPodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\"" Mar 17 17:49:37.922876 containerd[1483]: time="2025-03-17T17:49:37.922848951Z" level=info msg="TearDown network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" successfully" Mar 17 17:49:37.922876 containerd[1483]: time="2025-03-17T17:49:37.922866830Z" level=info msg="StopPodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" returns successfully" Mar 17 17:49:37.923528 containerd[1483]: time="2025-03-17T17:49:37.923503939Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\"" Mar 17 17:49:37.923707 containerd[1483]: time="2025-03-17T17:49:37.923652336Z" level=info msg="TearDown network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" successfully" Mar 17 17:49:37.923707 containerd[1483]: time="2025-03-17T17:49:37.923667655Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" returns successfully" Mar 17 17:49:37.923995 kubelet[2562]: I0317 17:49:37.923882 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211" Mar 17 17:49:37.924643 containerd[1483]: time="2025-03-17T17:49:37.924486960Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\"" Mar 17 17:49:37.924643 containerd[1483]: time="2025-03-17T17:49:37.924574639Z" level=info msg="TearDown network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" successfully" Mar 17 17:49:37.924643 containerd[1483]: time="2025-03-17T17:49:37.924584158Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" returns 
successfully" Mar 17 17:49:37.924643 containerd[1483]: time="2025-03-17T17:49:37.924594398Z" level=info msg="StopPodSandbox for \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\"" Mar 17 17:49:37.924767 containerd[1483]: time="2025-03-17T17:49:37.924752235Z" level=info msg="Ensure that sandbox 50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211 in task-service has been cleanup successfully" Mar 17 17:49:37.925036 containerd[1483]: time="2025-03-17T17:49:37.924955751Z" level=info msg="TearDown network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\" successfully" Mar 17 17:49:37.925036 containerd[1483]: time="2025-03-17T17:49:37.924978471Z" level=info msg="StopPodSandbox for \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\" returns successfully" Mar 17 17:49:37.926672 containerd[1483]: time="2025-03-17T17:49:37.926625240Z" level=info msg="StopPodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\"" Mar 17 17:49:37.927167 systemd[1]: run-netns-cni\x2dc09baa45\x2d369a\x2d1043\x2ddd09\x2dda9460890abe.mount: Deactivated successfully. 
Mar 17 17:49:37.927460 containerd[1483]: time="2025-03-17T17:49:37.927260468Z" level=info msg="TearDown network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" successfully" Mar 17 17:49:37.927460 containerd[1483]: time="2025-03-17T17:49:37.927290308Z" level=info msg="StopPodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" returns successfully" Mar 17 17:49:37.929680 containerd[1483]: time="2025-03-17T17:49:37.929443228Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\"" Mar 17 17:49:37.929680 containerd[1483]: time="2025-03-17T17:49:37.929554906Z" level=info msg="TearDown network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" successfully" Mar 17 17:49:37.929680 containerd[1483]: time="2025-03-17T17:49:37.929564305Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" returns successfully" Mar 17 17:49:37.931844 containerd[1483]: time="2025-03-17T17:49:37.931811423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:49:37.932647 containerd[1483]: time="2025-03-17T17:49:37.932429852Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\"" Mar 17 17:49:37.932647 containerd[1483]: time="2025-03-17T17:49:37.932610128Z" level=info msg="TearDown network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" successfully" Mar 17 17:49:37.932647 containerd[1483]: time="2025-03-17T17:49:37.932622048Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" returns successfully" Mar 17 17:49:37.933945 containerd[1483]: time="2025-03-17T17:49:37.933858025Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:4,}" Mar 17 17:49:37.937780 kubelet[2562]: I0317 17:49:37.935378 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.936310539Z" level=info msg="StopPodSandbox for \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\"" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.936465976Z" level=info msg="Ensure that sandbox 0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e in task-service has been cleanup successfully" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.936705092Z" level=info msg="TearDown network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\" successfully" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.936720172Z" level=info msg="StopPodSandbox for \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\" returns successfully" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.937125364Z" level=info msg="StopPodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\"" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.937202683Z" level=info msg="TearDown network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" successfully" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.937212162Z" level=info msg="StopPodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" returns successfully" Mar 17 17:49:37.937872 containerd[1483]: time="2025-03-17T17:49:37.937841071Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\"" Mar 17 17:49:37.938325 containerd[1483]: 
time="2025-03-17T17:49:37.937937669Z" level=info msg="TearDown network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" successfully" Mar 17 17:49:37.938325 containerd[1483]: time="2025-03-17T17:49:37.937948309Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" returns successfully" Mar 17 17:49:37.938377 containerd[1483]: time="2025-03-17T17:49:37.938344701Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\"" Mar 17 17:49:37.938512 containerd[1483]: time="2025-03-17T17:49:37.938423420Z" level=info msg="TearDown network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" successfully" Mar 17 17:49:37.938512 containerd[1483]: time="2025-03-17T17:49:37.938433380Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" returns successfully" Mar 17 17:49:37.939958 containerd[1483]: time="2025-03-17T17:49:37.939330723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:4,}" Mar 17 17:49:37.940271 kubelet[2562]: I0317 17:49:37.940253 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53" Mar 17 17:49:37.942090 containerd[1483]: time="2025-03-17T17:49:37.942062552Z" level=info msg="StopPodSandbox for \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\"" Mar 17 17:49:37.942267 containerd[1483]: time="2025-03-17T17:49:37.942249028Z" level=info msg="Ensure that sandbox 13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53 in task-service has been cleanup successfully" Mar 17 17:49:37.942495 containerd[1483]: time="2025-03-17T17:49:37.942477464Z" level=info msg="TearDown network for sandbox 
\"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\" successfully" Mar 17 17:49:37.942533 containerd[1483]: time="2025-03-17T17:49:37.942495104Z" level=info msg="StopPodSandbox for \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\" returns successfully" Mar 17 17:49:37.943143 containerd[1483]: time="2025-03-17T17:49:37.943071733Z" level=info msg="StopPodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\"" Mar 17 17:49:37.943316 containerd[1483]: time="2025-03-17T17:49:37.943153251Z" level=info msg="TearDown network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" successfully" Mar 17 17:49:37.943316 containerd[1483]: time="2025-03-17T17:49:37.943162731Z" level=info msg="StopPodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" returns successfully" Mar 17 17:49:37.943461 containerd[1483]: time="2025-03-17T17:49:37.943435326Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\"" Mar 17 17:49:37.951275 containerd[1483]: time="2025-03-17T17:49:37.951220501Z" level=info msg="TearDown network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" successfully" Mar 17 17:49:37.951275 containerd[1483]: time="2025-03-17T17:49:37.951261820Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" returns successfully" Mar 17 17:49:37.953215 containerd[1483]: time="2025-03-17T17:49:37.953072066Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\"" Mar 17 17:49:37.953455 containerd[1483]: time="2025-03-17T17:49:37.953418140Z" level=info msg="TearDown network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" successfully" Mar 17 17:49:37.953455 containerd[1483]: time="2025-03-17T17:49:37.953436179Z" level=info msg="StopPodSandbox for 
\"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" returns successfully" Mar 17 17:49:37.954236 containerd[1483]: time="2025-03-17T17:49:37.954155846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:4,}" Mar 17 17:49:37.959900 kubelet[2562]: I0317 17:49:37.959866 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f" Mar 17 17:49:37.960966 containerd[1483]: time="2025-03-17T17:49:37.960856961Z" level=info msg="StopPodSandbox for \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\"" Mar 17 17:49:37.961108 containerd[1483]: time="2025-03-17T17:49:37.961082397Z" level=info msg="Ensure that sandbox 88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f in task-service has been cleanup successfully" Mar 17 17:49:37.966748 containerd[1483]: time="2025-03-17T17:49:37.966590334Z" level=info msg="TearDown network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\" successfully" Mar 17 17:49:37.966748 containerd[1483]: time="2025-03-17T17:49:37.966737891Z" level=info msg="StopPodSandbox for \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\" returns successfully" Mar 17 17:49:37.967890 containerd[1483]: time="2025-03-17T17:49:37.967862790Z" level=info msg="StopPodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\"" Mar 17 17:49:37.968010 containerd[1483]: time="2025-03-17T17:49:37.967977108Z" level=info msg="TearDown network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" successfully" Mar 17 17:49:37.968010 containerd[1483]: time="2025-03-17T17:49:37.967988467Z" level=info msg="StopPodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" returns successfully" Mar 17 17:49:37.968335 
containerd[1483]: time="2025-03-17T17:49:37.968287342Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\"" Mar 17 17:49:37.968432 containerd[1483]: time="2025-03-17T17:49:37.968414180Z" level=info msg="TearDown network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" successfully" Mar 17 17:49:37.968461 containerd[1483]: time="2025-03-17T17:49:37.968430219Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" returns successfully" Mar 17 17:49:37.968717 containerd[1483]: time="2025-03-17T17:49:37.968691734Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\"" Mar 17 17:49:37.968792 containerd[1483]: time="2025-03-17T17:49:37.968773053Z" level=info msg="TearDown network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" successfully" Mar 17 17:49:37.968792 containerd[1483]: time="2025-03-17T17:49:37.968788013Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" returns successfully" Mar 17 17:49:37.969241 containerd[1483]: time="2025-03-17T17:49:37.969217485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:4,}" Mar 17 17:49:38.502454 systemd[1]: run-netns-cni\x2dde68ea93\x2de11c\x2d1ca0\x2d437a\x2d359cffce082b.mount: Deactivated successfully. Mar 17 17:49:38.502819 systemd[1]: run-netns-cni\x2dede70412\x2d95d5\x2d500c\x2dc142\x2ddcda650e30bc.mount: Deactivated successfully. Mar 17 17:49:38.502875 systemd[1]: run-netns-cni\x2d1ace759c\x2dfb86\x2d0657\x2daf27\x2dbcfb5b031d69.mount: Deactivated successfully. 
Mar 17 17:49:38.578792 systemd-networkd[1414]: cali0417f1d7c90: Link UP Mar 17 17:49:38.579095 systemd-networkd[1414]: cali0417f1d7c90: Gained carrier Mar 17 17:49:38.616219 kubelet[2562]: I0317 17:49:38.616159 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hznnh" podStartSLOduration=2.385765904 podStartE2EDuration="13.616139598s" podCreationTimestamp="2025-03-17 17:49:25 +0000 UTC" firstStartedPulling="2025-03-17 17:49:26.021246237 +0000 UTC m=+14.366839000" lastFinishedPulling="2025-03-17 17:49:37.251619931 +0000 UTC m=+25.597212694" observedRunningTime="2025-03-17 17:49:37.931206395 +0000 UTC m=+26.276799238" watchObservedRunningTime="2025-03-17 17:49:38.616139598 +0000 UTC m=+26.961732361" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.180 [INFO][4381] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.223 [INFO][4381] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--594fp-eth0 coredns-6f6b679f8f- kube-system eddc624d-899a-46cb-92b1-5f442cda8c19 686 0 2025-03-17 17:49:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-594fp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0417f1d7c90 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.224 [INFO][4381] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.497 [INFO][4446] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" HandleID="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Workload="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.518 [INFO][4446] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" HandleID="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Workload="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031aaf0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-594fp", "timestamp":"2025-03-17 17:49:38.497723221 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.518 [INFO][4446] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.518 [INFO][4446] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.518 [INFO][4446] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.521 [INFO][4446] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.534 [INFO][4446] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.538 [INFO][4446] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.540 [INFO][4446] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.542 [INFO][4446] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.542 [INFO][4446] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.544 [INFO][4446] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1 Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.553 [INFO][4446] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.564 [INFO][4446] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.564 [INFO][4446] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" host="localhost" Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.564 [INFO][4446] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:49:38.617959 containerd[1483]: 2025-03-17 17:49:38.564 [INFO][4446] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" HandleID="k8s-pod-network.97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Workload="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 17:49:38.618753 containerd[1483]: 2025-03-17 17:49:38.569 [INFO][4381] cni-plugin/k8s.go 386: Populated endpoint ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--594fp-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"eddc624d-899a-46cb-92b1-5f442cda8c19", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-594fp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0417f1d7c90", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.618753 containerd[1483]: 2025-03-17 17:49:38.569 [INFO][4381] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 17:49:38.618753 containerd[1483]: 2025-03-17 17:49:38.569 [INFO][4381] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0417f1d7c90 ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 17:49:38.618753 containerd[1483]: 2025-03-17 17:49:38.579 [INFO][4381] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 
17:49:38.618753 containerd[1483]: 2025-03-17 17:49:38.579 [INFO][4381] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--594fp-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"eddc624d-899a-46cb-92b1-5f442cda8c19", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1", Pod:"coredns-6f6b679f8f-594fp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0417f1d7c90", MAC:"02:f8:07:ff:9f:90", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.618753 containerd[1483]: 2025-03-17 17:49:38.616 [INFO][4381] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1" Namespace="kube-system" Pod="coredns-6f6b679f8f-594fp" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--594fp-eth0" Mar 17 17:49:38.666540 systemd-networkd[1414]: cali82fd8bbd489: Link UP Mar 17 17:49:38.666757 systemd-networkd[1414]: cali82fd8bbd489: Gained carrier Mar 17 17:49:38.678006 containerd[1483]: time="2025-03-17T17:49:38.677736283Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:38.678006 containerd[1483]: time="2025-03-17T17:49:38.677789162Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:38.678006 containerd[1483]: time="2025-03-17T17:49:38.677800082Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.678006 containerd[1483]: time="2025-03-17T17:49:38.677870041Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.172 [INFO][4372] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.226 [INFO][4372] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0 calico-kube-controllers-65cf745c96- calico-system b4b132a3-c152-4a3d-ac76-fe05e87a1881 688 0 2025-03-17 17:49:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:65cf745c96 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-65cf745c96-whxkh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali82fd8bbd489 [] []}} ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.227 [INFO][4372] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.513 [INFO][4453] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" HandleID="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" 
Workload="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4453] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" HandleID="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Workload="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003e2130), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-65cf745c96-whxkh", "timestamp":"2025-03-17 17:49:38.513357218 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4453] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.564 [INFO][4453] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.565 [INFO][4453] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.622 [INFO][4453] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.630 [INFO][4453] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.640 [INFO][4453] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.643 [INFO][4453] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.645 [INFO][4453] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.645 [INFO][4453] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.646 [INFO][4453] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.655 [INFO][4453] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.662 [INFO][4453] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.662 [INFO][4453] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" host="localhost" Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.662 [INFO][4453] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:49:38.679274 containerd[1483]: 2025-03-17 17:49:38.662 [INFO][4453] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" HandleID="k8s-pod-network.fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Workload="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.679771 containerd[1483]: 2025-03-17 17:49:38.665 [INFO][4372] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0", GenerateName:"calico-kube-controllers-65cf745c96-", Namespace:"calico-system", SelfLink:"", UID:"b4b132a3-c152-4a3d-ac76-fe05e87a1881", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65cf745c96", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-65cf745c96-whxkh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82fd8bbd489", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.679771 containerd[1483]: 2025-03-17 17:49:38.665 [INFO][4372] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.679771 containerd[1483]: 2025-03-17 17:49:38.665 [INFO][4372] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali82fd8bbd489 ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.679771 containerd[1483]: 2025-03-17 17:49:38.666 [INFO][4372] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.679771 containerd[1483]: 2025-03-17 17:49:38.667 [INFO][4372] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0", GenerateName:"calico-kube-controllers-65cf745c96-", Namespace:"calico-system", SelfLink:"", UID:"b4b132a3-c152-4a3d-ac76-fe05e87a1881", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"65cf745c96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d", Pod:"calico-kube-controllers-65cf745c96-whxkh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82fd8bbd489", MAC:"5e:d0:47:12:f9:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.679771 containerd[1483]: 2025-03-17 17:49:38.676 [INFO][4372] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d" Namespace="calico-system" Pod="calico-kube-controllers-65cf745c96-whxkh" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--65cf745c96--whxkh-eth0" Mar 17 17:49:38.704143 systemd[1]: Started cri-containerd-97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1.scope - libcontainer container 97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1. Mar 17 17:49:38.719273 containerd[1483]: time="2025-03-17T17:49:38.718547385Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:38.719273 containerd[1483]: time="2025-03-17T17:49:38.718631943Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:38.719273 containerd[1483]: time="2025-03-17T17:49:38.718649343Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.719273 containerd[1483]: time="2025-03-17T17:49:38.718717582Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.724013 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:49:38.746102 systemd[1]: Started cri-containerd-fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d.scope - libcontainer container fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d. 
Mar 17 17:49:38.763518 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:49:38.772669 containerd[1483]: time="2025-03-17T17:49:38.772612366Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-594fp,Uid:eddc624d-899a-46cb-92b1-5f442cda8c19,Namespace:kube-system,Attempt:4,} returns sandbox id \"97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1\"" Mar 17 17:49:38.778343 containerd[1483]: time="2025-03-17T17:49:38.778167386Z" level=info msg="CreateContainer within sandbox \"97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:49:38.779979 systemd-networkd[1414]: calie3411e29cdf: Link UP Mar 17 17:49:38.781272 systemd-networkd[1414]: calie3411e29cdf: Gained carrier Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.173 [INFO][4364] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.212 [INFO][4364] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0 coredns-6f6b679f8f- kube-system 458cde0b-a7e6-4774-81d4-b10558db7a0b 691 0 2025-03-17 17:49:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-xxmwz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie3411e29cdf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.212 [INFO][4364] 
cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.500 [INFO][4440] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" HandleID="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Workload="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4440] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" HandleID="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Workload="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d9f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-xxmwz", "timestamp":"2025-03-17 17:49:38.500510571 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4440] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.662 [INFO][4440] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.662 [INFO][4440] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.726 [INFO][4440] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.739 [INFO][4440] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.744 [INFO][4440] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.748 [INFO][4440] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.755 [INFO][4440] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.755 [INFO][4440] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.758 [INFO][4440] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05 Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.763 [INFO][4440] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.772 [INFO][4440] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.772 [INFO][4440] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" host="localhost" Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.772 [INFO][4440] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:49:38.795930 containerd[1483]: 2025-03-17 17:49:38.772 [INFO][4440] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" HandleID="k8s-pod-network.a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Workload="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 17:49:38.796479 containerd[1483]: 2025-03-17 17:49:38.776 [INFO][4364] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"458cde0b-a7e6-4774-81d4-b10558db7a0b", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-xxmwz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie3411e29cdf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.796479 containerd[1483]: 2025-03-17 17:49:38.776 [INFO][4364] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 17:49:38.796479 containerd[1483]: 2025-03-17 17:49:38.776 [INFO][4364] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie3411e29cdf ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 17:49:38.796479 containerd[1483]: 2025-03-17 17:49:38.780 [INFO][4364] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 
17:49:38.796479 containerd[1483]: 2025-03-17 17:49:38.781 [INFO][4364] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"458cde0b-a7e6-4774-81d4-b10558db7a0b", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05", Pod:"coredns-6f6b679f8f-xxmwz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie3411e29cdf", MAC:"5a:52:af:9c:37:55", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.796479 containerd[1483]: 2025-03-17 17:49:38.792 [INFO][4364] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05" Namespace="kube-system" Pod="coredns-6f6b679f8f-xxmwz" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--xxmwz-eth0" Mar 17 17:49:38.797319 containerd[1483]: time="2025-03-17T17:49:38.797198961Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-65cf745c96-whxkh,Uid:b4b132a3-c152-4a3d-ac76-fe05e87a1881,Namespace:calico-system,Attempt:4,} returns sandbox id \"fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d\"" Mar 17 17:49:38.799050 containerd[1483]: time="2025-03-17T17:49:38.798994289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 17 17:49:38.810226 containerd[1483]: time="2025-03-17T17:49:38.810173406Z" level=info msg="CreateContainer within sandbox \"97164a086d242477abfd15ebe4d3a482add54f7cab56b5b18981b633a7fb93f1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"eac9b1ee629049de071e3f985808ee72c53aa43bdb4dbca4016e14f1fdf4cc4f\"" Mar 17 17:49:38.811407 containerd[1483]: time="2025-03-17T17:49:38.810612918Z" level=info msg="StartContainer for \"eac9b1ee629049de071e3f985808ee72c53aa43bdb4dbca4016e14f1fdf4cc4f\"" Mar 17 17:49:38.819345 containerd[1483]: time="2025-03-17T17:49:38.818791610Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:38.819345 containerd[1483]: time="2025-03-17T17:49:38.818858649Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:38.819345 containerd[1483]: time="2025-03-17T17:49:38.818873449Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.819558 containerd[1483]: time="2025-03-17T17:49:38.819350280Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.843606 systemd[1]: Started cri-containerd-a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05.scope - libcontainer container a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05. Mar 17 17:49:38.844889 systemd[1]: Started cri-containerd-eac9b1ee629049de071e3f985808ee72c53aa43bdb4dbca4016e14f1fdf4cc4f.scope - libcontainer container eac9b1ee629049de071e3f985808ee72c53aa43bdb4dbca4016e14f1fdf4cc4f. Mar 17 17:49:38.868459 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:49:38.872647 systemd-networkd[1414]: cali9e203c00565: Link UP Mar 17 17:49:38.872823 systemd-networkd[1414]: cali9e203c00565: Gained carrier Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:37.982 [INFO][4331] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.125 [INFO][4331] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--zkh2s-eth0 csi-node-driver- calico-system 1bd3d1fb-9cd7-48fa-86bf-ebd468d40871 582 0 2025-03-17 17:49:25 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost 
csi-node-driver-zkh2s eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9e203c00565 [] []}} ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.126 [INFO][4331] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.503 [INFO][4418] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" HandleID="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Workload="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4418] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" HandleID="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Workload="localhost-k8s-csi--node--driver--zkh2s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000211910), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-zkh2s", "timestamp":"2025-03-17 17:49:38.503223522 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4418] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.772 [INFO][4418] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.772 [INFO][4418] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.824 [INFO][4418] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.839 [INFO][4418] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.846 [INFO][4418] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.849 [INFO][4418] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.852 [INFO][4418] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.852 [INFO][4418] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.854 [INFO][4418] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65 Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.859 [INFO][4418] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.867 [INFO][4418] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.867 [INFO][4418] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" host="localhost" Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.867 [INFO][4418] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:49:38.896318 containerd[1483]: 2025-03-17 17:49:38.867 [INFO][4418] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" HandleID="k8s-pod-network.f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Workload="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.897129 containerd[1483]: 2025-03-17 17:49:38.870 [INFO][4331] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--zkh2s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871", ResourceVersion:"582", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-zkh2s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9e203c00565", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.897129 containerd[1483]: 2025-03-17 17:49:38.871 [INFO][4331] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.897129 containerd[1483]: 2025-03-17 17:49:38.871 [INFO][4331] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e203c00565 ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.897129 containerd[1483]: 2025-03-17 17:49:38.872 [INFO][4331] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.897129 containerd[1483]: 2025-03-17 17:49:38.873 [INFO][4331] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--zkh2s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1bd3d1fb-9cd7-48fa-86bf-ebd468d40871", ResourceVersion:"582", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65", Pod:"csi-node-driver-zkh2s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9e203c00565", MAC:"2a:65:86:a3:a4:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:38.897129 containerd[1483]: 2025-03-17 17:49:38.893 [INFO][4331] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65" Namespace="calico-system" Pod="csi-node-driver-zkh2s" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--zkh2s-eth0" Mar 17 17:49:38.916700 containerd[1483]: time="2025-03-17T17:49:38.916645519Z" level=info msg="StartContainer for \"eac9b1ee629049de071e3f985808ee72c53aa43bdb4dbca4016e14f1fdf4cc4f\" returns successfully" Mar 17 17:49:38.931166 containerd[1483]: time="2025-03-17T17:49:38.931110298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-xxmwz,Uid:458cde0b-a7e6-4774-81d4-b10558db7a0b,Namespace:kube-system,Attempt:4,} returns sandbox id \"a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05\"" Mar 17 17:49:38.938643 containerd[1483]: time="2025-03-17T17:49:38.938589922Z" level=info msg="CreateContainer within sandbox \"a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 17:49:38.967529 containerd[1483]: time="2025-03-17T17:49:38.967383361Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:38.967529 containerd[1483]: time="2025-03-17T17:49:38.967455920Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:38.967529 containerd[1483]: time="2025-03-17T17:49:38.967467600Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.967713 containerd[1483]: time="2025-03-17T17:49:38.967637397Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:38.988442 kubelet[2562]: I0317 17:49:38.986947 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 17:49:39.014902 systemd[1]: Started cri-containerd-f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65.scope - libcontainer container f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65. Mar 17 17:49:39.017408 systemd-networkd[1414]: cali9942079dc91: Link UP Mar 17 17:49:39.018057 systemd-networkd[1414]: cali9942079dc91: Gained carrier Mar 17 17:49:39.023360 containerd[1483]: time="2025-03-17T17:49:39.023313682Z" level=info msg="CreateContainer within sandbox \"a1ac78c9db4b91478f4548e869c41b4fc9761422d00736b284eb853651171d05\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"549e6787bf17f407362b7c702d2f7e06dfae45a9be00d89fed405ffefcd3e9aa\"" Mar 17 17:49:39.025301 containerd[1483]: time="2025-03-17T17:49:39.025264167Z" level=info msg="StartContainer for \"549e6787bf17f407362b7c702d2f7e06dfae45a9be00d89fed405ffefcd3e9aa\"" Mar 17 17:49:39.041399 kubelet[2562]: I0317 17:49:39.041326 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-594fp" podStartSLOduration=20.040270264 podStartE2EDuration="20.040270264s" podCreationTimestamp="2025-03-17 17:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:49:39.008262665 +0000 UTC m=+27.353855428" watchObservedRunningTime="2025-03-17 17:49:39.040270264 +0000 UTC m=+27.385863027" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.140 [INFO][4388] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.159 [INFO][4388] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0 calico-apiserver-5d54787977- calico-apiserver cad2d73e-b1c8-4d7b-b3f2-125742f00ac4 693 0 2025-03-17 17:49:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d54787977 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5d54787977-bh55d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9942079dc91 [] []}} ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.159 [INFO][4388] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.496 [INFO][4422] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" HandleID="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Workload="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.528 [INFO][4422] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" HandleID="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Workload="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x40002f2960), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5d54787977-bh55d", "timestamp":"2025-03-17 17:49:38.49612013 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.528 [INFO][4422] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.867 [INFO][4422] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.868 [INFO][4422] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.926 [INFO][4422] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.942 [INFO][4422] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.953 [INFO][4422] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.957 [INFO][4422] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.961 [INFO][4422] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.962 [INFO][4422] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.969 [INFO][4422] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994 Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.979 [INFO][4422] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:38.999 [INFO][4422] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:39.000 [INFO][4422] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" host="localhost" Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:39.000 [INFO][4422] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 17:49:39.049560 containerd[1483]: 2025-03-17 17:49:39.000 [INFO][4422] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" HandleID="k8s-pod-network.f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Workload="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.050247 containerd[1483]: 2025-03-17 17:49:39.011 [INFO][4388] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0", GenerateName:"calico-apiserver-5d54787977-", Namespace:"calico-apiserver", SelfLink:"", UID:"cad2d73e-b1c8-4d7b-b3f2-125742f00ac4", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d54787977", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5d54787977-bh55d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9942079dc91", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:39.050247 containerd[1483]: 2025-03-17 17:49:39.011 [INFO][4388] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.050247 containerd[1483]: 2025-03-17 17:49:39.011 [INFO][4388] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9942079dc91 ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.050247 containerd[1483]: 2025-03-17 17:49:39.017 [INFO][4388] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.050247 containerd[1483]: 2025-03-17 17:49:39.021 [INFO][4388] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0", GenerateName:"calico-apiserver-5d54787977-", Namespace:"calico-apiserver", 
SelfLink:"", UID:"cad2d73e-b1c8-4d7b-b3f2-125742f00ac4", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d54787977", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994", Pod:"calico-apiserver-5d54787977-bh55d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9942079dc91", MAC:"7e:da:bc:1b:b5:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:39.050247 containerd[1483]: 2025-03-17 17:49:39.042 [INFO][4388] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-bh55d" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--bh55d-eth0" Mar 17 17:49:39.062428 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 17:49:39.096179 containerd[1483]: time="2025-03-17T17:49:39.096133885Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-zkh2s,Uid:1bd3d1fb-9cd7-48fa-86bf-ebd468d40871,Namespace:calico-system,Attempt:4,} returns sandbox id \"f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65\"" Mar 17 17:49:39.108108 systemd-networkd[1414]: cali729eb92bbfa: Link UP Mar 17 17:49:39.108644 systemd-networkd[1414]: cali729eb92bbfa: Gained carrier Mar 17 17:49:39.110189 systemd[1]: Started cri-containerd-549e6787bf17f407362b7c702d2f7e06dfae45a9be00d89fed405ffefcd3e9aa.scope - libcontainer container 549e6787bf17f407362b7c702d2f7e06dfae45a9be00d89fed405ffefcd3e9aa. Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:38.041 [INFO][4346] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:38.126 [INFO][4346] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0 calico-apiserver-5d54787977- calico-apiserver 67d8c30c-b1cb-4e16-9586-f6358511bb7c 692 0 2025-03-17 17:49:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d54787977 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5d54787977-q7h42 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali729eb92bbfa [] []}} ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:38.130 [INFO][4346] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" 
Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:38.508 [INFO][4415] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" HandleID="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Workload="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:38.529 [INFO][4415] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" HandleID="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Workload="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000331e10), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5d54787977-q7h42", "timestamp":"2025-03-17 17:49:38.508163072 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:38.530 [INFO][4415] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.000 [INFO][4415] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.000 [INFO][4415] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.024 [INFO][4415] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.040 [INFO][4415] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.054 [INFO][4415] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.058 [INFO][4415] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.061 [INFO][4415] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.061 [INFO][4415] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.064 [INFO][4415] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8 Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.071 [INFO][4415] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.085 [INFO][4415] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.085 [INFO][4415] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" host="localhost" Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.085 [INFO][4415] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 17:49:39.127270 containerd[1483]: 2025-03-17 17:49:39.085 [INFO][4415] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" HandleID="k8s-pod-network.d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Workload="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.127815 containerd[1483]: 2025-03-17 17:49:39.104 [INFO][4346] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0", GenerateName:"calico-apiserver-5d54787977-", Namespace:"calico-apiserver", SelfLink:"", UID:"67d8c30c-b1cb-4e16-9586-f6358511bb7c", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d54787977", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5d54787977-q7h42", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali729eb92bbfa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:39.127815 containerd[1483]: 2025-03-17 17:49:39.105 [INFO][4346] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.127815 containerd[1483]: 2025-03-17 17:49:39.105 [INFO][4346] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali729eb92bbfa ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.127815 containerd[1483]: 2025-03-17 17:49:39.109 [INFO][4346] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.127815 containerd[1483]: 2025-03-17 17:49:39.109 [INFO][4346] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0", GenerateName:"calico-apiserver-5d54787977-", Namespace:"calico-apiserver", SelfLink:"", UID:"67d8c30c-b1cb-4e16-9586-f6358511bb7c", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 17, 49, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d54787977", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8", Pod:"calico-apiserver-5d54787977-q7h42", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali729eb92bbfa", MAC:"42:cb:b7:e0:ba:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 17:49:39.127815 containerd[1483]: 2025-03-17 17:49:39.121 [INFO][4346] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8" Namespace="calico-apiserver" Pod="calico-apiserver-5d54787977-q7h42" WorkloadEndpoint="localhost-k8s-calico--apiserver--5d54787977--q7h42-eth0" Mar 17 17:49:39.128355 containerd[1483]: time="2025-03-17T17:49:39.127754771Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:39.128355 containerd[1483]: time="2025-03-17T17:49:39.127813729Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:39.128355 containerd[1483]: time="2025-03-17T17:49:39.127827849Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:39.128355 containerd[1483]: time="2025-03-17T17:49:39.127892848Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 17:49:39.163918 systemd[1]: Started cri-containerd-f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994.scope - libcontainer container f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994. Mar 17 17:49:39.166214 containerd[1483]: time="2025-03-17T17:49:39.165984740Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 17:49:39.166214 containerd[1483]: time="2025-03-17T17:49:39.166055099Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 17:49:39.166214 containerd[1483]: time="2025-03-17T17:49:39.166070099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:39.173096 containerd[1483]: time="2025-03-17T17:49:39.171624201Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 17:49:39.174958 containerd[1483]: time="2025-03-17T17:49:39.174680388Z" level=info msg="StartContainer for \"549e6787bf17f407362b7c702d2f7e06dfae45a9be00d89fed405ffefcd3e9aa\" returns successfully"
Mar 17 17:49:39.212119 systemd[1]: Started cri-containerd-d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8.scope - libcontainer container d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8.
Mar 17 17:49:39.217892 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Mar 17 17:49:39.244512 containerd[1483]: time="2025-03-17T17:49:39.244458364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-bh55d,Uid:cad2d73e-b1c8-4d7b-b3f2-125742f00ac4,Namespace:calico-apiserver,Attempt:4,} returns sandbox id \"f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994\""
Mar 17 17:49:39.249760 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Mar 17 17:49:39.307494 containerd[1483]: time="2025-03-17T17:49:39.307353702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d54787977-q7h42,Uid:67d8c30c-b1cb-4e16-9586-f6358511bb7c,Namespace:calico-apiserver,Attempt:4,} returns sandbox id \"d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8\""
Mar 17 17:49:39.322969 kernel: bpftool[4994]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Mar 17 17:49:39.474774 systemd-networkd[1414]: vxlan.calico: Link UP
Mar 17 17:49:39.474784 systemd-networkd[1414]: vxlan.calico: Gained carrier
Mar 17 17:49:39.904791 systemd-networkd[1414]: cali82fd8bbd489: Gained IPv6LL
Mar 17 17:49:40.035794 kubelet[2562]: I0317 17:49:40.035580 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-xxmwz" podStartSLOduration=21.035563313 podStartE2EDuration="21.035563313s" podCreationTimestamp="2025-03-17 17:49:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 17:49:40.034730608 +0000 UTC m=+28.380323371" watchObservedRunningTime="2025-03-17 17:49:40.035563313 +0000 UTC m=+28.381156156"
Mar 17 17:49:40.286154 systemd-networkd[1414]: cali0417f1d7c90: Gained IPv6LL
Mar 17 17:49:40.435587 containerd[1483]: time="2025-03-17T17:49:40.435283964Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:40.441941 containerd[1483]: time="2025-03-17T17:49:40.436571822Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257"
Mar 17 17:49:40.441941 containerd[1483]: time="2025-03-17T17:49:40.438092557Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:40.441941 containerd[1483]: time="2025-03-17T17:49:40.440905229Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:40.442074 containerd[1483]: time="2025-03-17T17:49:40.441977371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 1.642939603s"
Mar 17 17:49:40.442074 containerd[1483]: time="2025-03-17T17:49:40.442008570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\""
Mar 17 17:49:40.443954 containerd[1483]: time="2025-03-17T17:49:40.443634982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\""
Mar 17 17:49:40.452197 systemd[1]: Started sshd@7-10.0.0.115:22-10.0.0.1:59402.service - OpenSSH per-connection server daemon (10.0.0.1:59402).
Mar 17 17:49:40.453409 containerd[1483]: time="2025-03-17T17:49:40.452427313Z" level=info msg="CreateContainer within sandbox \"fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 17 17:49:40.479981 containerd[1483]: time="2025-03-17T17:49:40.478269234Z" level=info msg="CreateContainer within sandbox \"fd35fd93cfd214d27ef908a461bce3d78dcf5af9017d8dad1330befbd1dafe7d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0058a7b723927f62c900a233be5b4434540ca22b9d14cc3633d5691970631508\""
Mar 17 17:49:40.479981 containerd[1483]: time="2025-03-17T17:49:40.478881424Z" level=info msg="StartContainer for \"0058a7b723927f62c900a233be5b4434540ca22b9d14cc3633d5691970631508\""
Mar 17 17:49:40.479053 systemd-networkd[1414]: calie3411e29cdf: Gained IPv6LL
Mar 17 17:49:40.519537 sshd[5089]: Accepted publickey for core from 10.0.0.1 port 59402 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:49:40.520810 sshd-session[5089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:49:40.526135 systemd[1]: Started cri-containerd-0058a7b723927f62c900a233be5b4434540ca22b9d14cc3633d5691970631508.scope - libcontainer container 0058a7b723927f62c900a233be5b4434540ca22b9d14cc3633d5691970631508.
Mar 17 17:49:40.531138 systemd-logind[1469]: New session 8 of user core.
Mar 17 17:49:40.532458 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 17 17:49:40.543050 systemd-networkd[1414]: cali729eb92bbfa: Gained IPv6LL
Mar 17 17:49:40.564643 containerd[1483]: time="2025-03-17T17:49:40.564584648Z" level=info msg="StartContainer for \"0058a7b723927f62c900a233be5b4434540ca22b9d14cc3633d5691970631508\" returns successfully"
Mar 17 17:49:40.670434 systemd-networkd[1414]: cali9e203c00565: Gained IPv6LL
Mar 17 17:49:40.817190 sshd[5117]: Connection closed by 10.0.0.1 port 59402
Mar 17 17:49:40.817717 sshd-session[5089]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:40.820202 systemd[1]: session-8.scope: Deactivated successfully.
Mar 17 17:49:40.821494 systemd[1]: sshd@7-10.0.0.115:22-10.0.0.1:59402.service: Deactivated successfully.
Mar 17 17:49:40.823515 systemd-logind[1469]: Session 8 logged out. Waiting for processes to exit.
Mar 17 17:49:40.824701 systemd-logind[1469]: Removed session 8.
Mar 17 17:49:40.926110 systemd-networkd[1414]: cali9942079dc91: Gained IPv6LL
Mar 17 17:49:41.020679 kubelet[2562]: I0317 17:49:41.020585 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-65cf745c96-whxkh" podStartSLOduration=14.375667105 podStartE2EDuration="16.020553234s" podCreationTimestamp="2025-03-17 17:49:25 +0000 UTC" firstStartedPulling="2025-03-17 17:49:38.798525217 +0000 UTC m=+27.144117980" lastFinishedPulling="2025-03-17 17:49:40.443411386 +0000 UTC m=+28.789004109" observedRunningTime="2025-03-17 17:49:41.020333438 +0000 UTC m=+29.365926241" watchObservedRunningTime="2025-03-17 17:49:41.020553234 +0000 UTC m=+29.366145997"
Mar 17 17:49:41.118105 systemd-networkd[1414]: vxlan.calico: Gained IPv6LL
Mar 17 17:49:41.715238 containerd[1483]: time="2025-03-17T17:49:41.715157805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:41.717333 containerd[1483]: time="2025-03-17T17:49:41.717252971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801"
Mar 17 17:49:41.718459 containerd[1483]: time="2025-03-17T17:49:41.718131156Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:41.721644 containerd[1483]: time="2025-03-17T17:49:41.721603939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:41.722574 containerd[1483]: time="2025-03-17T17:49:41.722104091Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.278435069s"
Mar 17 17:49:41.722574 containerd[1483]: time="2025-03-17T17:49:41.722130891Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\""
Mar 17 17:49:41.723225 containerd[1483]: time="2025-03-17T17:49:41.723206473Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 17 17:49:41.724209 containerd[1483]: time="2025-03-17T17:49:41.724182017Z" level=info msg="CreateContainer within sandbox \"f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 17 17:49:41.747945 containerd[1483]: time="2025-03-17T17:49:41.747890027Z" level=info msg="CreateContainer within sandbox \"f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"630d73aa33f83b55edf9becd1f5137b25f169151628606ac6c531422574d8a55\""
Mar 17 17:49:41.749091 containerd[1483]: time="2025-03-17T17:49:41.749060168Z" level=info msg="StartContainer for \"630d73aa33f83b55edf9becd1f5137b25f169151628606ac6c531422574d8a55\""
Mar 17 17:49:41.791140 systemd[1]: Started cri-containerd-630d73aa33f83b55edf9becd1f5137b25f169151628606ac6c531422574d8a55.scope - libcontainer container 630d73aa33f83b55edf9becd1f5137b25f169151628606ac6c531422574d8a55.
Mar 17 17:49:41.820368 containerd[1483]: time="2025-03-17T17:49:41.820311595Z" level=info msg="StartContainer for \"630d73aa33f83b55edf9becd1f5137b25f169151628606ac6c531422574d8a55\" returns successfully"
Mar 17 17:49:42.017903 kubelet[2562]: I0317 17:49:42.017556 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:49:43.163404 containerd[1483]: time="2025-03-17T17:49:43.163347815Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:43.164142 containerd[1483]: time="2025-03-17T17:49:43.164090084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267"
Mar 17 17:49:43.164991 containerd[1483]: time="2025-03-17T17:49:43.164939510Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:43.167154 containerd[1483]: time="2025-03-17T17:49:43.167115797Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:43.167985 containerd[1483]: time="2025-03-17T17:49:43.167896585Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 1.444587234s"
Mar 17 17:49:43.167985 containerd[1483]: time="2025-03-17T17:49:43.167945264Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\""
Mar 17 17:49:43.169260 containerd[1483]: time="2025-03-17T17:49:43.169002088Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 17 17:49:43.170726 containerd[1483]: time="2025-03-17T17:49:43.170695742Z" level=info msg="CreateContainer within sandbox \"f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 17 17:49:43.180916 containerd[1483]: time="2025-03-17T17:49:43.180716747Z" level=info msg="CreateContainer within sandbox \"f43f64453f7bf51f8d70bb25ddada6244516e91507fa56cf8831643e1a915994\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"40f25e956155f72136ed96991faa6d6342bbc121f99c07ba894de29c9340e7b1\""
Mar 17 17:49:43.181636 containerd[1483]: time="2025-03-17T17:49:43.181603333Z" level=info msg="StartContainer for \"40f25e956155f72136ed96991faa6d6342bbc121f99c07ba894de29c9340e7b1\""
Mar 17 17:49:43.215089 systemd[1]: Started cri-containerd-40f25e956155f72136ed96991faa6d6342bbc121f99c07ba894de29c9340e7b1.scope - libcontainer container 40f25e956155f72136ed96991faa6d6342bbc121f99c07ba894de29c9340e7b1.
Mar 17 17:49:43.262745 containerd[1483]: time="2025-03-17T17:49:43.262704081Z" level=info msg="StartContainer for \"40f25e956155f72136ed96991faa6d6342bbc121f99c07ba894de29c9340e7b1\" returns successfully"
Mar 17 17:49:43.468588 containerd[1483]: time="2025-03-17T17:49:43.468476983Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:43.471606 containerd[1483]: time="2025-03-17T17:49:43.471558976Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 17 17:49:43.473636 containerd[1483]: time="2025-03-17T17:49:43.473460946Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 304.4291ms"
Mar 17 17:49:43.473636 containerd[1483]: time="2025-03-17T17:49:43.473509946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\""
Mar 17 17:49:43.476924 containerd[1483]: time="2025-03-17T17:49:43.476086626Z" level=info msg="CreateContainer within sandbox \"d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 17 17:49:43.477581 containerd[1483]: time="2025-03-17T17:49:43.477417125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 17 17:49:43.489999 containerd[1483]: time="2025-03-17T17:49:43.489956052Z" level=info msg="CreateContainer within sandbox \"d75a1267930f1ea092c01c01dc8e7968376516999e83173513456ff59c9fdeb8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4616c9b2b7fb5273363266bc2d290d25dc6e60b1363c9a139690199c079a7980\""
Mar 17 17:49:43.490661 containerd[1483]: time="2025-03-17T17:49:43.490581922Z" level=info msg="StartContainer for \"4616c9b2b7fb5273363266bc2d290d25dc6e60b1363c9a139690199c079a7980\""
Mar 17 17:49:43.519131 systemd[1]: Started cri-containerd-4616c9b2b7fb5273363266bc2d290d25dc6e60b1363c9a139690199c079a7980.scope - libcontainer container 4616c9b2b7fb5273363266bc2d290d25dc6e60b1363c9a139690199c079a7980.
Mar 17 17:49:43.561793 containerd[1483]: time="2025-03-17T17:49:43.561668864Z" level=info msg="StartContainer for \"4616c9b2b7fb5273363266bc2d290d25dc6e60b1363c9a139690199c079a7980\" returns successfully"
Mar 17 17:49:44.060114 kubelet[2562]: I0317 17:49:44.057753 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d54787977-q7h42" podStartSLOduration=13.894676171 podStartE2EDuration="18.057735192s" podCreationTimestamp="2025-03-17 17:49:26 +0000 UTC" firstStartedPulling="2025-03-17 17:49:39.311612067 +0000 UTC m=+27.657204830" lastFinishedPulling="2025-03-17 17:49:43.474671088 +0000 UTC m=+31.820263851" observedRunningTime="2025-03-17 17:49:44.056060217 +0000 UTC m=+32.401652980" watchObservedRunningTime="2025-03-17 17:49:44.057735192 +0000 UTC m=+32.403327955"
Mar 17 17:49:44.061691 kubelet[2562]: I0317 17:49:44.060716 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5d54787977-bh55d" podStartSLOduration=14.137857155 podStartE2EDuration="18.060697828s" podCreationTimestamp="2025-03-17 17:49:26 +0000 UTC" firstStartedPulling="2025-03-17 17:49:39.246025137 +0000 UTC m=+27.591617860" lastFinishedPulling="2025-03-17 17:49:43.16886577 +0000 UTC m=+31.514458533" observedRunningTime="2025-03-17 17:49:44.044934863 +0000 UTC m=+32.390527626" watchObservedRunningTime="2025-03-17 17:49:44.060697828 +0000 UTC m=+32.406290591"
Mar 17 17:49:44.701247 containerd[1483]: time="2025-03-17T17:49:44.701194966Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:44.705938 containerd[1483]: time="2025-03-17T17:49:44.703091618Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717"
Mar 17 17:49:44.705938 containerd[1483]: time="2025-03-17T17:49:44.704165962Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:44.708214 containerd[1483]: time="2025-03-17T17:49:44.708136423Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 17 17:49:44.709365 containerd[1483]: time="2025-03-17T17:49:44.709232486Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.231788841s"
Mar 17 17:49:44.709365 containerd[1483]: time="2025-03-17T17:49:44.709265406Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\""
Mar 17 17:49:44.711189 containerd[1483]: time="2025-03-17T17:49:44.711161617Z" level=info msg="CreateContainer within sandbox \"f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 17 17:49:44.727896 containerd[1483]: time="2025-03-17T17:49:44.727849888Z" level=info msg="CreateContainer within sandbox \"f24eb210348efc254998e656397378e24565a42028b66ddc673d9c5177d5ea65\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"237483937d644c5c8577a9fba9e56445d348d6865f3135a0a03e63f44b27edcc\""
Mar 17 17:49:44.728765 containerd[1483]: time="2025-03-17T17:49:44.728735154Z" level=info msg="StartContainer for \"237483937d644c5c8577a9fba9e56445d348d6865f3135a0a03e63f44b27edcc\""
Mar 17 17:49:44.766103 systemd[1]: Started cri-containerd-237483937d644c5c8577a9fba9e56445d348d6865f3135a0a03e63f44b27edcc.scope - libcontainer container 237483937d644c5c8577a9fba9e56445d348d6865f3135a0a03e63f44b27edcc.
Mar 17 17:49:44.829496 containerd[1483]: time="2025-03-17T17:49:44.829444888Z" level=info msg="StartContainer for \"237483937d644c5c8577a9fba9e56445d348d6865f3135a0a03e63f44b27edcc\" returns successfully"
Mar 17 17:49:45.050328 kubelet[2562]: I0317 17:49:45.050285 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:49:45.051033 kubelet[2562]: I0317 17:49:45.050677 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:49:45.818515 kubelet[2562]: I0317 17:49:45.818000 2562 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 17 17:49:45.824669 kubelet[2562]: I0317 17:49:45.824638 2562 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 17 17:49:45.831545 systemd[1]: Started sshd@8-10.0.0.115:22-10.0.0.1:40812.service - OpenSSH per-connection server daemon (10.0.0.1:40812).
Mar 17 17:49:45.912286 sshd[5337]: Accepted publickey for core from 10.0.0.1 port 40812 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:49:45.914494 sshd-session[5337]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:49:45.918570 systemd-logind[1469]: New session 9 of user core.
Mar 17 17:49:45.925111 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 17 17:49:46.141554 sshd[5339]: Connection closed by 10.0.0.1 port 40812
Mar 17 17:49:46.142188 sshd-session[5337]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:46.145993 systemd[1]: sshd@8-10.0.0.115:22-10.0.0.1:40812.service: Deactivated successfully.
Mar 17 17:49:46.147984 systemd[1]: session-9.scope: Deactivated successfully.
Mar 17 17:49:46.148630 systemd-logind[1469]: Session 9 logged out. Waiting for processes to exit.
Mar 17 17:49:46.149434 systemd-logind[1469]: Removed session 9.
Mar 17 17:49:48.454876 kubelet[2562]: I0317 17:49:48.454827 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:49:48.518516 systemd[1]: run-containerd-runc-k8s.io-1c88a8d099967c79094306c265402dee11729280559a2ecb12a7cd3199ad8eb4-runc.tZvf8K.mount: Deactivated successfully.
Mar 17 17:49:48.591784 kubelet[2562]: I0317 17:49:48.591715 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zkh2s" podStartSLOduration=17.983352 podStartE2EDuration="23.591698767s" podCreationTimestamp="2025-03-17 17:49:25 +0000 UTC" firstStartedPulling="2025-03-17 17:49:39.101691907 +0000 UTC m=+27.447284670" lastFinishedPulling="2025-03-17 17:49:44.710038674 +0000 UTC m=+33.055631437" observedRunningTime="2025-03-17 17:49:45.062994943 +0000 UTC m=+33.408587706" watchObservedRunningTime="2025-03-17 17:49:48.591698767 +0000 UTC m=+36.937291530"
Mar 17 17:49:51.154226 systemd[1]: Started sshd@9-10.0.0.115:22-10.0.0.1:40820.service - OpenSSH per-connection server daemon (10.0.0.1:40820).
Mar 17 17:49:51.214333 sshd[5410]: Accepted publickey for core from 10.0.0.1 port 40820 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:49:51.215761 sshd-session[5410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:49:51.220599 systemd-logind[1469]: New session 10 of user core.
Mar 17 17:49:51.233102 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 17 17:49:51.396245 sshd[5412]: Connection closed by 10.0.0.1 port 40820
Mar 17 17:49:51.396655 sshd-session[5410]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:51.413403 systemd[1]: sshd@9-10.0.0.115:22-10.0.0.1:40820.service: Deactivated successfully.
Mar 17 17:49:51.417092 systemd[1]: session-10.scope: Deactivated successfully.
Mar 17 17:49:51.418717 systemd-logind[1469]: Session 10 logged out. Waiting for processes to exit.
Mar 17 17:49:51.429265 systemd[1]: Started sshd@10-10.0.0.115:22-10.0.0.1:40826.service - OpenSSH per-connection server daemon (10.0.0.1:40826).
Mar 17 17:49:51.430710 systemd-logind[1469]: Removed session 10.
Mar 17 17:49:51.471253 sshd[5425]: Accepted publickey for core from 10.0.0.1 port 40826 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:49:51.472849 sshd-session[5425]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:49:51.477956 systemd-logind[1469]: New session 11 of user core.
Mar 17 17:49:51.489162 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 17 17:49:51.712629 sshd[5428]: Connection closed by 10.0.0.1 port 40826
Mar 17 17:49:51.714128 sshd-session[5425]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:51.727691 systemd[1]: sshd@10-10.0.0.115:22-10.0.0.1:40826.service: Deactivated successfully.
Mar 17 17:49:51.732730 systemd[1]: session-11.scope: Deactivated successfully.
Mar 17 17:49:51.734468 systemd-logind[1469]: Session 11 logged out. Waiting for processes to exit.
Mar 17 17:49:51.747569 systemd[1]: Started sshd@11-10.0.0.115:22-10.0.0.1:40842.service - OpenSSH per-connection server daemon (10.0.0.1:40842).
Mar 17 17:49:51.749186 systemd-logind[1469]: Removed session 11.
Mar 17 17:49:51.791626 sshd[5439]: Accepted publickey for core from 10.0.0.1 port 40842 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:49:51.792227 sshd-session[5439]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:49:51.797728 systemd-logind[1469]: New session 12 of user core.
Mar 17 17:49:51.809081 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 17 17:49:51.953932 sshd[5442]: Connection closed by 10.0.0.1 port 40842
Mar 17 17:49:51.953897 sshd-session[5439]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:51.957979 systemd[1]: sshd@11-10.0.0.115:22-10.0.0.1:40842.service: Deactivated successfully.
Mar 17 17:49:51.959651 systemd[1]: session-12.scope: Deactivated successfully.
Mar 17 17:49:51.961371 systemd-logind[1469]: Session 12 logged out. Waiting for processes to exit.
Mar 17 17:49:51.962979 systemd-logind[1469]: Removed session 12.
Mar 17 17:49:56.969271 systemd[1]: Started sshd@12-10.0.0.115:22-10.0.0.1:59868.service - OpenSSH per-connection server daemon (10.0.0.1:59868).
Mar 17 17:49:57.018243 sshd[5460]: Accepted publickey for core from 10.0.0.1 port 59868 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:49:57.019610 sshd-session[5460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:49:57.023144 systemd-logind[1469]: New session 13 of user core.
Mar 17 17:49:57.032057 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 17 17:49:57.203018 sshd[5462]: Connection closed by 10.0.0.1 port 59868
Mar 17 17:49:57.203413 sshd-session[5460]: pam_unix(sshd:session): session closed for user core
Mar 17 17:49:57.210627 systemd[1]: sshd@12-10.0.0.115:22-10.0.0.1:59868.service: Deactivated successfully.
Mar 17 17:49:57.212812 systemd[1]: session-13.scope: Deactivated successfully.
Mar 17 17:49:57.213541 systemd-logind[1469]: Session 13 logged out. Waiting for processes to exit.
Mar 17 17:49:57.214527 systemd-logind[1469]: Removed session 13.
Mar 17 17:49:57.687017 kubelet[2562]: I0317 17:49:57.686704 2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:50:02.215835 systemd[1]: Started sshd@13-10.0.0.115:22-10.0.0.1:59882.service - OpenSSH per-connection server daemon (10.0.0.1:59882).
Mar 17 17:50:02.281022 sshd[5526]: Accepted publickey for core from 10.0.0.1 port 59882 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:02.282425 sshd-session[5526]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:02.289049 systemd-logind[1469]: New session 14 of user core.
Mar 17 17:50:02.305171 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 17 17:50:02.490985 sshd[5528]: Connection closed by 10.0.0.1 port 59882
Mar 17 17:50:02.491385 sshd-session[5526]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:02.504620 systemd[1]: sshd@13-10.0.0.115:22-10.0.0.1:59882.service: Deactivated successfully.
Mar 17 17:50:02.507545 systemd[1]: session-14.scope: Deactivated successfully.
Mar 17 17:50:02.509643 systemd-logind[1469]: Session 14 logged out. Waiting for processes to exit.
Mar 17 17:50:02.520440 systemd[1]: Started sshd@14-10.0.0.115:22-10.0.0.1:57920.service - OpenSSH per-connection server daemon (10.0.0.1:57920).
Mar 17 17:50:02.521261 systemd-logind[1469]: Removed session 14.
Mar 17 17:50:02.564933 sshd[5540]: Accepted publickey for core from 10.0.0.1 port 57920 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:02.566381 sshd-session[5540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:02.571497 systemd-logind[1469]: New session 15 of user core.
Mar 17 17:50:02.579100 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 17 17:50:02.825859 sshd[5543]: Connection closed by 10.0.0.1 port 57920
Mar 17 17:50:02.828007 sshd-session[5540]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:02.841899 systemd[1]: sshd@14-10.0.0.115:22-10.0.0.1:57920.service: Deactivated successfully.
Mar 17 17:50:02.844026 systemd[1]: session-15.scope: Deactivated successfully.
Mar 17 17:50:02.844776 systemd-logind[1469]: Session 15 logged out. Waiting for processes to exit.
Mar 17 17:50:02.855504 systemd[1]: Started sshd@15-10.0.0.115:22-10.0.0.1:57936.service - OpenSSH per-connection server daemon (10.0.0.1:57936).
Mar 17 17:50:02.856307 systemd-logind[1469]: Removed session 15.
Mar 17 17:50:02.900112 sshd[5554]: Accepted publickey for core from 10.0.0.1 port 57936 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:02.901488 sshd-session[5554]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:02.908234 systemd-logind[1469]: New session 16 of user core.
Mar 17 17:50:02.920117 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 17 17:50:04.335392 sshd[5557]: Connection closed by 10.0.0.1 port 57936
Mar 17 17:50:04.336161 sshd-session[5554]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:04.352761 systemd[1]: sshd@15-10.0.0.115:22-10.0.0.1:57936.service: Deactivated successfully.
Mar 17 17:50:04.358659 systemd[1]: session-16.scope: Deactivated successfully.
Mar 17 17:50:04.358864 systemd[1]: session-16.scope: Consumed 484ms CPU time, 73.1M memory peak.
Mar 17 17:50:04.362159 systemd-logind[1469]: Session 16 logged out. Waiting for processes to exit.
Mar 17 17:50:04.375240 systemd[1]: Started sshd@16-10.0.0.115:22-10.0.0.1:57942.service - OpenSSH per-connection server daemon (10.0.0.1:57942).
Mar 17 17:50:04.379048 systemd-logind[1469]: Removed session 16.
Mar 17 17:50:04.426898 sshd[5575]: Accepted publickey for core from 10.0.0.1 port 57942 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:04.428220 sshd-session[5575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:04.432512 systemd-logind[1469]: New session 17 of user core.
Mar 17 17:50:04.445115 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 17 17:50:04.785569 sshd[5580]: Connection closed by 10.0.0.1 port 57942
Mar 17 17:50:04.787441 sshd-session[5575]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:04.796360 systemd[1]: sshd@16-10.0.0.115:22-10.0.0.1:57942.service: Deactivated successfully.
Mar 17 17:50:04.798491 systemd[1]: session-17.scope: Deactivated successfully.
Mar 17 17:50:04.799512 systemd-logind[1469]: Session 17 logged out. Waiting for processes to exit.
Mar 17 17:50:04.814426 systemd[1]: Started sshd@17-10.0.0.115:22-10.0.0.1:57956.service - OpenSSH per-connection server daemon (10.0.0.1:57956).
Mar 17 17:50:04.815520 systemd-logind[1469]: Removed session 17.
Mar 17 17:50:04.855391 sshd[5591]: Accepted publickey for core from 10.0.0.1 port 57956 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:04.857007 sshd-session[5591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:04.860951 systemd-logind[1469]: New session 18 of user core.
Mar 17 17:50:04.868122 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 17 17:50:05.030433 sshd[5594]: Connection closed by 10.0.0.1 port 57956
Mar 17 17:50:05.030447 sshd-session[5591]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:05.034122 systemd[1]: sshd@17-10.0.0.115:22-10.0.0.1:57956.service: Deactivated successfully.
Mar 17 17:50:05.035927 systemd[1]: session-18.scope: Deactivated successfully.
Mar 17 17:50:05.037021 systemd-logind[1469]: Session 18 logged out. Waiting for processes to exit.
Mar 17 17:50:05.038342 systemd-logind[1469]: Removed session 18.
Mar 17 17:50:10.048461 systemd[1]: Started sshd@18-10.0.0.115:22-10.0.0.1:57972.service - OpenSSH per-connection server daemon (10.0.0.1:57972).
Mar 17 17:50:10.104244 sshd[5613]: Accepted publickey for core from 10.0.0.1 port 57972 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:10.105579 sshd-session[5613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:10.111390 systemd-logind[1469]: New session 19 of user core.
Mar 17 17:50:10.119160 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 17 17:50:10.254339 sshd[5615]: Connection closed by 10.0.0.1 port 57972
Mar 17 17:50:10.255117 sshd-session[5613]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:10.259644 systemd[1]: sshd@18-10.0.0.115:22-10.0.0.1:57972.service: Deactivated successfully.
Mar 17 17:50:10.262685 systemd[1]: session-19.scope: Deactivated successfully.
Mar 17 17:50:10.264193 systemd-logind[1469]: Session 19 logged out. Waiting for processes to exit.
Mar 17 17:50:10.265238 systemd-logind[1469]: Removed session 19.
Mar 17 17:50:11.733824 containerd[1483]: time="2025-03-17T17:50:11.733772777Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\""
Mar 17 17:50:11.752183 containerd[1483]: time="2025-03-17T17:50:11.750407072Z" level=info msg="TearDown network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" successfully"
Mar 17 17:50:11.752183 containerd[1483]: time="2025-03-17T17:50:11.750442991Z" level=info msg="StopPodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" returns successfully"
Mar 17 17:50:11.752545 containerd[1483]: time="2025-03-17T17:50:11.752516058Z" level=info msg="RemovePodSandbox for \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\""
Mar 17 17:50:11.755086 containerd[1483]: time="2025-03-17T17:50:11.755036242Z" level=info msg="Forcibly stopping sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\""
Mar 17 17:50:11.755178 containerd[1483]: time="2025-03-17T17:50:11.755153561Z" level=info msg="TearDown network for sandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" successfully"
Mar 17 17:50:11.758044 containerd[1483]: time="2025-03-17T17:50:11.758002223Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.758121 containerd[1483]: time="2025-03-17T17:50:11.758070703Z" level=info msg="RemovePodSandbox \"2ff5d833f75cbcf0acd81ed81c7aea95e2896a42079bebced44ebff88ee58feb\" returns successfully"
Mar 17 17:50:11.758585 containerd[1483]: time="2025-03-17T17:50:11.758533540Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\""
Mar 17 17:50:11.758649 containerd[1483]: time="2025-03-17T17:50:11.758624699Z" level=info msg="TearDown network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" successfully"
Mar 17 17:50:11.758649 containerd[1483]: time="2025-03-17T17:50:11.758634259Z" level=info msg="StopPodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" returns successfully"
Mar 17 17:50:11.760393 containerd[1483]: time="2025-03-17T17:50:11.758992457Z" level=info msg="RemovePodSandbox for \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\""
Mar 17 17:50:11.760393 containerd[1483]: time="2025-03-17T17:50:11.759024257Z" level=info msg="Forcibly stopping sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\""
Mar 17 17:50:11.760393 containerd[1483]: time="2025-03-17T17:50:11.759091376Z" level=info msg="TearDown network for sandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" successfully"
Mar 17 17:50:11.762192 containerd[1483]: time="2025-03-17T17:50:11.762145517Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.762357 containerd[1483]: time="2025-03-17T17:50:11.762336356Z" level=info msg="RemovePodSandbox \"50e4e5337991507c60394bbbc126d7990db428c14c953b2b19bea9a9b0a319cc\" returns successfully"
Mar 17 17:50:11.762769 containerd[1483]: time="2025-03-17T17:50:11.762737233Z" level=info msg="StopPodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\""
Mar 17 17:50:11.762848 containerd[1483]: time="2025-03-17T17:50:11.762831553Z" level=info msg="TearDown network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" successfully"
Mar 17 17:50:11.762848 containerd[1483]: time="2025-03-17T17:50:11.762845993Z" level=info msg="StopPodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" returns successfully"
Mar 17 17:50:11.763260 containerd[1483]: time="2025-03-17T17:50:11.763190830Z" level=info msg="RemovePodSandbox for \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\""
Mar 17 17:50:11.763260 containerd[1483]: time="2025-03-17T17:50:11.763255710Z" level=info msg="Forcibly stopping sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\""
Mar 17 17:50:11.763355 containerd[1483]: time="2025-03-17T17:50:11.763323950Z" level=info msg="TearDown network for sandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" successfully"
Mar 17 17:50:11.768634 containerd[1483]: time="2025-03-17T17:50:11.768583676Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.768727 containerd[1483]: time="2025-03-17T17:50:11.768657076Z" level=info msg="RemovePodSandbox \"007d92b7ad9dfd0435208a9a6b9ebc73f8721531265d24c44ae49a4d3e4ed3f2\" returns successfully"
Mar 17 17:50:11.769155 containerd[1483]: time="2025-03-17T17:50:11.769131953Z" level=info msg="StopPodSandbox for \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\""
Mar 17 17:50:11.769300 containerd[1483]: time="2025-03-17T17:50:11.769262712Z" level=info msg="TearDown network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\" successfully"
Mar 17 17:50:11.769359 containerd[1483]: time="2025-03-17T17:50:11.769296592Z" level=info msg="StopPodSandbox for \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\" returns successfully"
Mar 17 17:50:11.771141 containerd[1483]: time="2025-03-17T17:50:11.769643109Z" level=info msg="RemovePodSandbox for \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\""
Mar 17 17:50:11.771141 containerd[1483]: time="2025-03-17T17:50:11.769673709Z" level=info msg="Forcibly stopping sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\""
Mar 17 17:50:11.771141 containerd[1483]: time="2025-03-17T17:50:11.769788309Z" level=info msg="TearDown network for sandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\" successfully"
Mar 17 17:50:11.772567 containerd[1483]: time="2025-03-17T17:50:11.772533651Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.772711 containerd[1483]: time="2025-03-17T17:50:11.772692290Z" level=info msg="RemovePodSandbox \"d4dc2ceef3fd0f6b550939d59d537b9c640d06c04adc2a45d571e9a3551521d1\" returns successfully"
Mar 17 17:50:11.773237 containerd[1483]: time="2025-03-17T17:50:11.773208407Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\""
Mar 17 17:50:11.773323 containerd[1483]: time="2025-03-17T17:50:11.773304686Z" level=info msg="TearDown network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" successfully"
Mar 17 17:50:11.773323 containerd[1483]: time="2025-03-17T17:50:11.773319806Z" level=info msg="StopPodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" returns successfully"
Mar 17 17:50:11.777088 containerd[1483]: time="2025-03-17T17:50:11.777035463Z" level=info msg="RemovePodSandbox for \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\""
Mar 17 17:50:11.777088 containerd[1483]: time="2025-03-17T17:50:11.777088542Z" level=info msg="Forcibly stopping sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\""
Mar 17 17:50:11.777244 containerd[1483]: time="2025-03-17T17:50:11.777214341Z" level=info msg="TearDown network for sandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" successfully"
Mar 17 17:50:11.782213 containerd[1483]: time="2025-03-17T17:50:11.782150750Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.782329 containerd[1483]: time="2025-03-17T17:50:11.782226310Z" level=info msg="RemovePodSandbox \"2ca26f2d412d39c35fec69702a4d8beba3d7aecfeece45a0edccfa6275b36dd0\" returns successfully"
Mar 17 17:50:11.783192 containerd[1483]: time="2025-03-17T17:50:11.782666307Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\""
Mar 17 17:50:11.783192 containerd[1483]: time="2025-03-17T17:50:11.782765986Z" level=info msg="TearDown network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" successfully"
Mar 17 17:50:11.783192 containerd[1483]: time="2025-03-17T17:50:11.782776026Z" level=info msg="StopPodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" returns successfully"
Mar 17 17:50:11.783365 containerd[1483]: time="2025-03-17T17:50:11.783234743Z" level=info msg="RemovePodSandbox for \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\""
Mar 17 17:50:11.783365 containerd[1483]: time="2025-03-17T17:50:11.783259103Z" level=info msg="Forcibly stopping sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\""
Mar 17 17:50:11.783365 containerd[1483]: time="2025-03-17T17:50:11.783323703Z" level=info msg="TearDown network for sandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" successfully"
Mar 17 17:50:11.786527 containerd[1483]: time="2025-03-17T17:50:11.786495442Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.786606 containerd[1483]: time="2025-03-17T17:50:11.786550922Z" level=info msg="RemovePodSandbox \"b485f95af3a50c5fcfa091a69bbb9b84476068f45ba11e53eeed349dbb923ce9\" returns successfully"
Mar 17 17:50:11.787026 containerd[1483]: time="2025-03-17T17:50:11.786995399Z" level=info msg="StopPodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\""
Mar 17 17:50:11.787154 containerd[1483]: time="2025-03-17T17:50:11.787107799Z" level=info msg="TearDown network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" successfully"
Mar 17 17:50:11.787154 containerd[1483]: time="2025-03-17T17:50:11.787123318Z" level=info msg="StopPodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" returns successfully"
Mar 17 17:50:11.787839 containerd[1483]: time="2025-03-17T17:50:11.787687315Z" level=info msg="RemovePodSandbox for \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\""
Mar 17 17:50:11.787839 containerd[1483]: time="2025-03-17T17:50:11.787711995Z" level=info msg="Forcibly stopping sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\""
Mar 17 17:50:11.787839 containerd[1483]: time="2025-03-17T17:50:11.787808914Z" level=info msg="TearDown network for sandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" successfully"
Mar 17 17:50:11.791056 containerd[1483]: time="2025-03-17T17:50:11.791010014Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.791119 containerd[1483]: time="2025-03-17T17:50:11.791073853Z" level=info msg="RemovePodSandbox \"6d268ed8a8500e07d60905cedf58467b599a99051b9d1c752c93c9d13069b47a\" returns successfully"
Mar 17 17:50:11.791618 containerd[1483]: time="2025-03-17T17:50:11.791593370Z" level=info msg="StopPodSandbox for \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\""
Mar 17 17:50:11.791705 containerd[1483]: time="2025-03-17T17:50:11.791690490Z" level=info msg="TearDown network for sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\" successfully"
Mar 17 17:50:11.791746 containerd[1483]: time="2025-03-17T17:50:11.791704249Z" level=info msg="StopPodSandbox for \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\" returns successfully"
Mar 17 17:50:11.792043 containerd[1483]: time="2025-03-17T17:50:11.792019687Z" level=info msg="RemovePodSandbox for \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\""
Mar 17 17:50:11.792076 containerd[1483]: time="2025-03-17T17:50:11.792051007Z" level=info msg="Forcibly stopping sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\""
Mar 17 17:50:11.792136 containerd[1483]: time="2025-03-17T17:50:11.792116487Z" level=info msg="TearDown network for sandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\" successfully"
Mar 17 17:50:11.794832 containerd[1483]: time="2025-03-17T17:50:11.794795910Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.794979 containerd[1483]: time="2025-03-17T17:50:11.794855789Z" level=info msg="RemovePodSandbox \"13df38814b5f3e1fbf7bc786fd42294526fb8c029fa320c14eb7f6d827e77b53\" returns successfully"
Mar 17 17:50:11.795330 containerd[1483]: time="2025-03-17T17:50:11.795304187Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\""
Mar 17 17:50:11.795416 containerd[1483]: time="2025-03-17T17:50:11.795398266Z" level=info msg="TearDown network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" successfully"
Mar 17 17:50:11.795463 containerd[1483]: time="2025-03-17T17:50:11.795414386Z" level=info msg="StopPodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" returns successfully"
Mar 17 17:50:11.795669 containerd[1483]: time="2025-03-17T17:50:11.795645384Z" level=info msg="RemovePodSandbox for \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\""
Mar 17 17:50:11.795698 containerd[1483]: time="2025-03-17T17:50:11.795674824Z" level=info msg="Forcibly stopping sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\""
Mar 17 17:50:11.795755 containerd[1483]: time="2025-03-17T17:50:11.795740504Z" level=info msg="TearDown network for sandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" successfully"
Mar 17 17:50:11.798951 containerd[1483]: time="2025-03-17T17:50:11.798913284Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.799014 containerd[1483]: time="2025-03-17T17:50:11.798977883Z" level=info msg="RemovePodSandbox \"4fa6b30c4a8b8eb8dab626a570ebdb518fac8616cc3b6f2efafe2933c2204e60\" returns successfully"
Mar 17 17:50:11.799411 containerd[1483]: time="2025-03-17T17:50:11.799383321Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\""
Mar 17 17:50:11.799513 containerd[1483]: time="2025-03-17T17:50:11.799495440Z" level=info msg="TearDown network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" successfully"
Mar 17 17:50:11.799513 containerd[1483]: time="2025-03-17T17:50:11.799511800Z" level=info msg="StopPodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" returns successfully"
Mar 17 17:50:11.801217 containerd[1483]: time="2025-03-17T17:50:11.799816758Z" level=info msg="RemovePodSandbox for \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\""
Mar 17 17:50:11.801217 containerd[1483]: time="2025-03-17T17:50:11.799849878Z" level=info msg="Forcibly stopping sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\""
Mar 17 17:50:11.801217 containerd[1483]: time="2025-03-17T17:50:11.799945637Z" level=info msg="TearDown network for sandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" successfully"
Mar 17 17:50:11.802690 containerd[1483]: time="2025-03-17T17:50:11.802648980Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.802818 containerd[1483]: time="2025-03-17T17:50:11.802800379Z" level=info msg="RemovePodSandbox \"1b778ea688710c7d4559759be1f78b75fdf2b4290a00c6d87c10f7f91747fb8b\" returns successfully"
Mar 17 17:50:11.803324 containerd[1483]: time="2025-03-17T17:50:11.803262536Z" level=info msg="StopPodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\""
Mar 17 17:50:11.803415 containerd[1483]: time="2025-03-17T17:50:11.803390375Z" level=info msg="TearDown network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" successfully"
Mar 17 17:50:11.803415 containerd[1483]: time="2025-03-17T17:50:11.803407655Z" level=info msg="StopPodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" returns successfully"
Mar 17 17:50:11.803744 containerd[1483]: time="2025-03-17T17:50:11.803709053Z" level=info msg="RemovePodSandbox for \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\""
Mar 17 17:50:11.803744 containerd[1483]: time="2025-03-17T17:50:11.803741933Z" level=info msg="Forcibly stopping sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\""
Mar 17 17:50:11.803843 containerd[1483]: time="2025-03-17T17:50:11.803808093Z" level=info msg="TearDown network for sandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" successfully"
Mar 17 17:50:11.806449 containerd[1483]: time="2025-03-17T17:50:11.806416476Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.806507 containerd[1483]: time="2025-03-17T17:50:11.806484716Z" level=info msg="RemovePodSandbox \"2c93d4d60247a914f217fe31ff46929d0dd8e4d50b692d4e32b7b66031ed86b5\" returns successfully"
Mar 17 17:50:11.807091 containerd[1483]: time="2025-03-17T17:50:11.806871353Z" level=info msg="StopPodSandbox for \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\""
Mar 17 17:50:11.807091 containerd[1483]: time="2025-03-17T17:50:11.806976752Z" level=info msg="TearDown network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\" successfully"
Mar 17 17:50:11.807091 containerd[1483]: time="2025-03-17T17:50:11.806988232Z" level=info msg="StopPodSandbox for \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\" returns successfully"
Mar 17 17:50:11.808985 containerd[1483]: time="2025-03-17T17:50:11.807584749Z" level=info msg="RemovePodSandbox for \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\""
Mar 17 17:50:11.808985 containerd[1483]: time="2025-03-17T17:50:11.807620428Z" level=info msg="Forcibly stopping sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\""
Mar 17 17:50:11.808985 containerd[1483]: time="2025-03-17T17:50:11.807695188Z" level=info msg="TearDown network for sandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\" successfully"
Mar 17 17:50:11.810726 containerd[1483]: time="2025-03-17T17:50:11.810689889Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.810884 containerd[1483]: time="2025-03-17T17:50:11.810863208Z" level=info msg="RemovePodSandbox \"88ed5b8fa3b224fa3fbc713ca364849daa06ba842efe12533ac1f9a6c4283b9f\" returns successfully"
Mar 17 17:50:11.811473 containerd[1483]: time="2025-03-17T17:50:11.811434524Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\""
Mar 17 17:50:11.811764 containerd[1483]: time="2025-03-17T17:50:11.811547563Z" level=info msg="TearDown network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" successfully"
Mar 17 17:50:11.811764 containerd[1483]: time="2025-03-17T17:50:11.811562763Z" level=info msg="StopPodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" returns successfully"
Mar 17 17:50:11.811836 containerd[1483]: time="2025-03-17T17:50:11.811783682Z" level=info msg="RemovePodSandbox for \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\""
Mar 17 17:50:11.811836 containerd[1483]: time="2025-03-17T17:50:11.811804682Z" level=info msg="Forcibly stopping sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\""
Mar 17 17:50:11.811883 containerd[1483]: time="2025-03-17T17:50:11.811857521Z" level=info msg="TearDown network for sandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" successfully"
Mar 17 17:50:11.814153 containerd[1483]: time="2025-03-17T17:50:11.814118947Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.814228 containerd[1483]: time="2025-03-17T17:50:11.814186507Z" level=info msg="RemovePodSandbox \"bc22e09341122207cd47f6e31bf566e7bd0804de41b3a1358c60c46376196462\" returns successfully"
Mar 17 17:50:11.814805 containerd[1483]: time="2025-03-17T17:50:11.814626464Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\""
Mar 17 17:50:11.814805 containerd[1483]: time="2025-03-17T17:50:11.814732223Z" level=info msg="TearDown network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" successfully"
Mar 17 17:50:11.814805 containerd[1483]: time="2025-03-17T17:50:11.814742183Z" level=info msg="StopPodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" returns successfully"
Mar 17 17:50:11.815297 containerd[1483]: time="2025-03-17T17:50:11.815269980Z" level=info msg="RemovePodSandbox for \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\""
Mar 17 17:50:11.815297 containerd[1483]: time="2025-03-17T17:50:11.815297860Z" level=info msg="Forcibly stopping sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\""
Mar 17 17:50:11.815377 containerd[1483]: time="2025-03-17T17:50:11.815364419Z" level=info msg="TearDown network for sandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" successfully"
Mar 17 17:50:11.817891 containerd[1483]: time="2025-03-17T17:50:11.817852683Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.817990 containerd[1483]: time="2025-03-17T17:50:11.817955323Z" level=info msg="RemovePodSandbox \"0d6840bc996a634737e9cbfb276847b5cda2929c710ec600925510eecb09e64e\" returns successfully"
Mar 17 17:50:11.818590 containerd[1483]: time="2025-03-17T17:50:11.818406720Z" level=info msg="StopPodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\""
Mar 17 17:50:11.818590 containerd[1483]: time="2025-03-17T17:50:11.818511159Z" level=info msg="TearDown network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" successfully"
Mar 17 17:50:11.818590 containerd[1483]: time="2025-03-17T17:50:11.818522959Z" level=info msg="StopPodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" returns successfully"
Mar 17 17:50:11.819039 containerd[1483]: time="2025-03-17T17:50:11.818978516Z" level=info msg="RemovePodSandbox for \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\""
Mar 17 17:50:11.819039 containerd[1483]: time="2025-03-17T17:50:11.819008356Z" level=info msg="Forcibly stopping sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\""
Mar 17 17:50:11.819196 containerd[1483]: time="2025-03-17T17:50:11.819082636Z" level=info msg="TearDown network for sandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" successfully"
Mar 17 17:50:11.828008 containerd[1483]: time="2025-03-17T17:50:11.827943459Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.828104 containerd[1483]: time="2025-03-17T17:50:11.828025379Z" level=info msg="RemovePodSandbox \"0f61b133dd973499165da8a8a75aa3c4ad9eff898f23f7312770100bf54cd246\" returns successfully"
Mar 17 17:50:11.828785 containerd[1483]: time="2025-03-17T17:50:11.828608015Z" level=info msg="StopPodSandbox for \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\""
Mar 17 17:50:11.828785 containerd[1483]: time="2025-03-17T17:50:11.828710015Z" level=info msg="TearDown network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\" successfully"
Mar 17 17:50:11.828785 containerd[1483]: time="2025-03-17T17:50:11.828718854Z" level=info msg="StopPodSandbox for \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\" returns successfully"
Mar 17 17:50:11.829252 containerd[1483]: time="2025-03-17T17:50:11.829227531Z" level=info msg="RemovePodSandbox for \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\""
Mar 17 17:50:11.829332 containerd[1483]: time="2025-03-17T17:50:11.829254891Z" level=info msg="Forcibly stopping sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\""
Mar 17 17:50:11.829332 containerd[1483]: time="2025-03-17T17:50:11.829320291Z" level=info msg="TearDown network for sandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\" successfully"
Mar 17 17:50:11.839239 containerd[1483]: time="2025-03-17T17:50:11.839187668Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.839339 containerd[1483]: time="2025-03-17T17:50:11.839268907Z" level=info msg="RemovePodSandbox \"0d12bb851bdbba63f753a7298498a305baff7390ffd642cca922dc21d3ba929e\" returns successfully"
Mar 17 17:50:11.839994 containerd[1483]: time="2025-03-17T17:50:11.839963103Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\""
Mar 17 17:50:11.840092 containerd[1483]: time="2025-03-17T17:50:11.840071582Z" level=info msg="TearDown network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" successfully"
Mar 17 17:50:11.840092 containerd[1483]: time="2025-03-17T17:50:11.840086182Z" level=info msg="StopPodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" returns successfully"
Mar 17 17:50:11.840432 containerd[1483]: time="2025-03-17T17:50:11.840404380Z" level=info msg="RemovePodSandbox for \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\""
Mar 17 17:50:11.840432 containerd[1483]: time="2025-03-17T17:50:11.840433340Z" level=info msg="Forcibly stopping sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\""
Mar 17 17:50:11.840514 containerd[1483]: time="2025-03-17T17:50:11.840497500Z" level=info msg="TearDown network for sandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" successfully"
Mar 17 17:50:11.845193 containerd[1483]: time="2025-03-17T17:50:11.845131510Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.845275 containerd[1483]: time="2025-03-17T17:50:11.845208310Z" level=info msg="RemovePodSandbox \"ecebb527898b774d900534643b439af6fc5b98de624478e45505c849c74e3f0c\" returns successfully"
Mar 17 17:50:11.845715 containerd[1483]: time="2025-03-17T17:50:11.845682027Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\""
Mar 17 17:50:11.845809 containerd[1483]: time="2025-03-17T17:50:11.845782906Z" level=info msg="TearDown network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" successfully"
Mar 17 17:50:11.845809 containerd[1483]: time="2025-03-17T17:50:11.845798426Z" level=info msg="StopPodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" returns successfully"
Mar 17 17:50:11.846175 containerd[1483]: time="2025-03-17T17:50:11.846145024Z" level=info msg="RemovePodSandbox for \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\""
Mar 17 17:50:11.846226 containerd[1483]: time="2025-03-17T17:50:11.846180184Z" level=info msg="Forcibly stopping sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\""
Mar 17 17:50:11.846267 containerd[1483]: time="2025-03-17T17:50:11.846252103Z" level=info msg="TearDown network for sandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" successfully"
Mar 17 17:50:11.848954 containerd[1483]: time="2025-03-17T17:50:11.848915686Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.849048 containerd[1483]: time="2025-03-17T17:50:11.848978286Z" level=info msg="RemovePodSandbox \"226119407bc3703f61e9be159a67a6f5cf625c3c6bada9d0ac09e79dbb89e615\" returns successfully"
Mar 17 17:50:11.849528 containerd[1483]: time="2025-03-17T17:50:11.849387883Z" level=info msg="StopPodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\""
Mar 17 17:50:11.849528 containerd[1483]: time="2025-03-17T17:50:11.849483443Z" level=info msg="TearDown network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" successfully"
Mar 17 17:50:11.849528 containerd[1483]: time="2025-03-17T17:50:11.849492843Z" level=info msg="StopPodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" returns successfully"
Mar 17 17:50:11.850022 containerd[1483]: time="2025-03-17T17:50:11.849934040Z" level=info msg="RemovePodSandbox for \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\""
Mar 17 17:50:11.850022 containerd[1483]: time="2025-03-17T17:50:11.849960560Z" level=info msg="Forcibly stopping sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\""
Mar 17 17:50:11.850022 containerd[1483]: time="2025-03-17T17:50:11.850016559Z" level=info msg="TearDown network for sandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" successfully"
Mar 17 17:50:11.861625 containerd[1483]: time="2025-03-17T17:50:11.861560486Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.861625 containerd[1483]: time="2025-03-17T17:50:11.861627726Z" level=info msg="RemovePodSandbox \"e878fb48306a9f3eaf2ece19db93d88bc1a0e50c1df342228962223e436a9fe0\" returns successfully"
Mar 17 17:50:11.862191 containerd[1483]: time="2025-03-17T17:50:11.862144522Z" level=info msg="StopPodSandbox for \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\""
Mar 17 17:50:11.862285 containerd[1483]: time="2025-03-17T17:50:11.862259962Z" level=info msg="TearDown network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\" successfully"
Mar 17 17:50:11.862285 containerd[1483]: time="2025-03-17T17:50:11.862274081Z" level=info msg="StopPodSandbox for \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\" returns successfully"
Mar 17 17:50:11.862740 containerd[1483]: time="2025-03-17T17:50:11.862718319Z" level=info msg="RemovePodSandbox for \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\""
Mar 17 17:50:11.862789 containerd[1483]: time="2025-03-17T17:50:11.862744158Z" level=info msg="Forcibly stopping sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\""
Mar 17 17:50:11.862828 containerd[1483]: time="2025-03-17T17:50:11.862814358Z" level=info msg="TearDown network for sandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\" successfully"
Mar 17 17:50:11.872141 containerd[1483]: time="2025-03-17T17:50:11.871960580Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.872141 containerd[1483]: time="2025-03-17T17:50:11.872039459Z" level=info msg="RemovePodSandbox \"50cdc9a2b8d6cf4f464069553c8e8640c7ee5e163649e6c5ad67dfddcdc1a211\" returns successfully"
Mar 17 17:50:11.872786 containerd[1483]: time="2025-03-17T17:50:11.872491337Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\""
Mar 17 17:50:11.872786 containerd[1483]: time="2025-03-17T17:50:11.872599616Z" level=info msg="TearDown network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" successfully"
Mar 17 17:50:11.872786 containerd[1483]: time="2025-03-17T17:50:11.872610816Z" level=info msg="StopPodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" returns successfully"
Mar 17 17:50:11.873054 containerd[1483]: time="2025-03-17T17:50:11.873027813Z" level=info msg="RemovePodSandbox for \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\""
Mar 17 17:50:11.873136 containerd[1483]: time="2025-03-17T17:50:11.873114573Z" level=info msg="Forcibly stopping sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\""
Mar 17 17:50:11.873255 containerd[1483]: time="2025-03-17T17:50:11.873240012Z" level=info msg="TearDown network for sandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" successfully"
Mar 17 17:50:11.876174 containerd[1483]: time="2025-03-17T17:50:11.876131434Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.876582 containerd[1483]: time="2025-03-17T17:50:11.876316152Z" level=info msg="RemovePodSandbox \"a235f67b1271158937992bd58a81f1798b9f432940006d7ee71ea87d11c79eec\" returns successfully"
Mar 17 17:50:11.876774 containerd[1483]: time="2025-03-17T17:50:11.876740310Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\""
Mar 17 17:50:11.876861 containerd[1483]: time="2025-03-17T17:50:11.876844869Z" level=info msg="TearDown network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" successfully"
Mar 17 17:50:11.876886 containerd[1483]: time="2025-03-17T17:50:11.876859469Z" level=info msg="StopPodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" returns successfully"
Mar 17 17:50:11.877491 containerd[1483]: time="2025-03-17T17:50:11.877254986Z" level=info msg="RemovePodSandbox for \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\""
Mar 17 17:50:11.877491 containerd[1483]: time="2025-03-17T17:50:11.877282466Z" level=info msg="Forcibly stopping sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\""
Mar 17 17:50:11.877491 containerd[1483]: time="2025-03-17T17:50:11.877346066Z" level=info msg="TearDown network for sandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" successfully"
Mar 17 17:50:11.880306 containerd[1483]: time="2025-03-17T17:50:11.880260247Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.880432 containerd[1483]: time="2025-03-17T17:50:11.880415766Z" level=info msg="RemovePodSandbox \"2694779d599c93132ad5dac34a2670b807596ea6fe52fad5e73d469f8449460a\" returns successfully"
Mar 17 17:50:11.880936 containerd[1483]: time="2025-03-17T17:50:11.880875163Z" level=info msg="StopPodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\""
Mar 17 17:50:11.881028 containerd[1483]: time="2025-03-17T17:50:11.881000803Z" level=info msg="TearDown network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" successfully"
Mar 17 17:50:11.881028 containerd[1483]: time="2025-03-17T17:50:11.881019082Z" level=info msg="StopPodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" returns successfully"
Mar 17 17:50:11.881515 containerd[1483]: time="2025-03-17T17:50:11.881492199Z" level=info msg="RemovePodSandbox for \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\""
Mar 17 17:50:11.881558 containerd[1483]: time="2025-03-17T17:50:11.881522919Z" level=info msg="Forcibly stopping sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\""
Mar 17 17:50:11.881603 containerd[1483]: time="2025-03-17T17:50:11.881588919Z" level=info msg="TearDown network for sandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" successfully"
Mar 17 17:50:11.884525 containerd[1483]: time="2025-03-17T17:50:11.884487420Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.884585 containerd[1483]: time="2025-03-17T17:50:11.884557420Z" level=info msg="RemovePodSandbox \"39cddc503b38b83b7cfc62b5f34a578c306b25776b4501caa4a8cb126e2c93e6\" returns successfully"
Mar 17 17:50:11.885246 containerd[1483]: time="2025-03-17T17:50:11.885048657Z" level=info msg="StopPodSandbox for \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\""
Mar 17 17:50:11.885246 containerd[1483]: time="2025-03-17T17:50:11.885145736Z" level=info msg="TearDown network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\" successfully"
Mar 17 17:50:11.885246 containerd[1483]: time="2025-03-17T17:50:11.885186056Z" level=info msg="StopPodSandbox for \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\" returns successfully"
Mar 17 17:50:11.885504 containerd[1483]: time="2025-03-17T17:50:11.885462534Z" level=info msg="RemovePodSandbox for \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\""
Mar 17 17:50:11.885504 containerd[1483]: time="2025-03-17T17:50:11.885491894Z" level=info msg="Forcibly stopping sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\""
Mar 17 17:50:11.885584 containerd[1483]: time="2025-03-17T17:50:11.885568734Z" level=info msg="TearDown network for sandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\" successfully"
Mar 17 17:50:11.888325 containerd[1483]: time="2025-03-17T17:50:11.888290836Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Mar 17 17:50:11.888418 containerd[1483]: time="2025-03-17T17:50:11.888349956Z" level=info msg="RemovePodSandbox \"b9478339c7bbf2dafdd56f4acd66cf18038d60f2870b1c9e872d33f6449f937d\" returns successfully"
Mar 17 17:50:15.198235 kubelet[2562]: I0317 17:50:15.197935    2562 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 17 17:50:15.277446 systemd[1]: Started sshd@19-10.0.0.115:22-10.0.0.1:45172.service - OpenSSH per-connection server daemon (10.0.0.1:45172).
Mar 17 17:50:15.324230 sshd[5632]: Accepted publickey for core from 10.0.0.1 port 45172 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:15.325736 sshd-session[5632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:15.330535 systemd-logind[1469]: New session 20 of user core.
Mar 17 17:50:15.338160 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 17 17:50:15.475453 sshd[5634]: Connection closed by 10.0.0.1 port 45172
Mar 17 17:50:15.476522 sshd-session[5632]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:15.480113 systemd[1]: sshd@19-10.0.0.115:22-10.0.0.1:45172.service: Deactivated successfully.
Mar 17 17:50:15.482635 systemd[1]: session-20.scope: Deactivated successfully.
Mar 17 17:50:15.483477 systemd-logind[1469]: Session 20 logged out. Waiting for processes to exit.
Mar 17 17:50:15.484450 systemd-logind[1469]: Removed session 20.
Mar 17 17:50:20.492494 systemd[1]: Started sshd@20-10.0.0.115:22-10.0.0.1:45176.service - OpenSSH per-connection server daemon (10.0.0.1:45176).
Mar 17 17:50:20.545438 sshd[5682]: Accepted publickey for core from 10.0.0.1 port 45176 ssh2: RSA SHA256:5Ue/V+RoCRMkcnXRZmyQndEQOSMEwJs2XNBwCapeMHg
Mar 17 17:50:20.547009 sshd-session[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 17 17:50:20.551619 systemd-logind[1469]: New session 21 of user core.
Mar 17 17:50:20.558045 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 17 17:50:20.705935 sshd[5684]: Connection closed by 10.0.0.1 port 45176
Mar 17 17:50:20.706328 sshd-session[5682]: pam_unix(sshd:session): session closed for user core
Mar 17 17:50:20.709127 systemd-logind[1469]: Session 21 logged out. Waiting for processes to exit.
Mar 17 17:50:20.709477 systemd[1]: sshd@20-10.0.0.115:22-10.0.0.1:45176.service: Deactivated successfully.
Mar 17 17:50:20.714217 systemd[1]: session-21.scope: Deactivated successfully.
Mar 17 17:50:20.717941 systemd-logind[1469]: Removed session 21.