Mar 21 12:32:46.899181 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 21 12:32:46.899202 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Fri Mar 21 10:53:54 -00 2025 Mar 21 12:32:46.899211 kernel: KASLR enabled Mar 21 12:32:46.899264 kernel: efi: EFI v2.7 by EDK II Mar 21 12:32:46.899271 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdbbae018 ACPI 2.0=0xd9b43018 RNG=0xd9b43a18 MEMRESERVE=0xd9b40218 Mar 21 12:32:46.899276 kernel: random: crng init done Mar 21 12:32:46.899282 kernel: secureboot: Secure boot disabled Mar 21 12:32:46.899288 kernel: ACPI: Early table checksum verification disabled Mar 21 12:32:46.899294 kernel: ACPI: RSDP 0x00000000D9B43018 000024 (v02 BOCHS ) Mar 21 12:32:46.899302 kernel: ACPI: XSDT 0x00000000D9B43F18 000064 (v01 BOCHS BXPC 00000001 01000013) Mar 21 12:32:46.899308 kernel: ACPI: FACP 0x00000000D9B43B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899314 kernel: ACPI: DSDT 0x00000000D9B41018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899319 kernel: ACPI: APIC 0x00000000D9B43C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899325 kernel: ACPI: PPTT 0x00000000D9B43098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899332 kernel: ACPI: GTDT 0x00000000D9B43818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899339 kernel: ACPI: MCFG 0x00000000D9B43A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899345 kernel: ACPI: SPCR 0x00000000D9B43918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899351 kernel: ACPI: DBG2 0x00000000D9B43998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899357 kernel: ACPI: IORT 0x00000000D9B43198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 21 12:32:46.899362 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Mar 21 12:32:46.899368 kernel: NUMA: Failed to initialise from firmware Mar 21 12:32:46.899374 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Mar 21 12:32:46.899380 kernel: NUMA: NODE_DATA [mem 0xdc95a800-0xdc95ffff] Mar 21 12:32:46.899386 kernel: Zone ranges: Mar 21 12:32:46.899392 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Mar 21 12:32:46.899399 kernel: DMA32 empty Mar 21 12:32:46.899405 kernel: Normal empty Mar 21 12:32:46.899410 kernel: Movable zone start for each node Mar 21 12:32:46.899416 kernel: Early memory node ranges Mar 21 12:32:46.899422 kernel: node 0: [mem 0x0000000040000000-0x00000000d967ffff] Mar 21 12:32:46.899428 kernel: node 0: [mem 0x00000000d9680000-0x00000000d968ffff] Mar 21 12:32:46.899434 kernel: node 0: [mem 0x00000000d9690000-0x00000000d976ffff] Mar 21 12:32:46.899440 kernel: node 0: [mem 0x00000000d9770000-0x00000000d9b3ffff] Mar 21 12:32:46.899445 kernel: node 0: [mem 0x00000000d9b40000-0x00000000dce1ffff] Mar 21 12:32:46.899451 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff] Mar 21 12:32:46.899457 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff] Mar 21 12:32:46.899463 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff] Mar 21 12:32:46.899470 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Mar 21 12:32:46.899476 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Mar 21 12:32:46.899482 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Mar 21 12:32:46.899490 kernel: psci: 
probing for conduit method from ACPI. Mar 21 12:32:46.899497 kernel: psci: PSCIv1.1 detected in firmware. Mar 21 12:32:46.899503 kernel: psci: Using standard PSCI v0.2 function IDs Mar 21 12:32:46.899510 kernel: psci: Trusted OS migration not required Mar 21 12:32:46.899516 kernel: psci: SMC Calling Convention v1.1 Mar 21 12:32:46.899523 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Mar 21 12:32:46.899529 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 21 12:32:46.899536 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 21 12:32:46.899542 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Mar 21 12:32:46.899548 kernel: Detected PIPT I-cache on CPU0 Mar 21 12:32:46.899555 kernel: CPU features: detected: GIC system register CPU interface Mar 21 12:32:46.899561 kernel: CPU features: detected: Hardware dirty bit management Mar 21 12:32:46.899567 kernel: CPU features: detected: Spectre-v4 Mar 21 12:32:46.899574 kernel: CPU features: detected: Spectre-BHB Mar 21 12:32:46.899581 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 21 12:32:46.899587 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 21 12:32:46.899593 kernel: CPU features: detected: ARM erratum 1418040 Mar 21 12:32:46.899600 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 21 12:32:46.899606 kernel: alternatives: applying boot alternatives Mar 21 12:32:46.899614 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=93cb17f03b776356c0810b716fff0c7c2012572bbe395c702f6873d17674684f Mar 21 12:32:46.899620 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 21 12:32:46.899627 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 21 12:32:46.899633 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 21 12:32:46.899639 kernel: Fallback order for Node 0: 0 Mar 21 12:32:46.899647 kernel: Built 1 zonelists, mobility grouping on. Total pages: 633024 Mar 21 12:32:46.899653 kernel: Policy zone: DMA Mar 21 12:32:46.899659 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 21 12:32:46.899666 kernel: software IO TLB: area num 4. Mar 21 12:32:46.899672 kernel: software IO TLB: mapped [mem 0x00000000d2e00000-0x00000000d6e00000] (64MB) Mar 21 12:32:46.899679 kernel: Memory: 2387420K/2572288K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 184868K reserved, 0K cma-reserved) Mar 21 12:32:46.899685 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Mar 21 12:32:46.899691 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 21 12:32:46.899698 kernel: rcu: RCU event tracing is enabled. Mar 21 12:32:46.899705 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Mar 21 12:32:46.899712 kernel: Trampoline variant of Tasks RCU enabled. Mar 21 12:32:46.899718 kernel: Tracing variant of Tasks RCU enabled. Mar 21 12:32:46.899726 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
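The kernel command line captured above (BOOT_IMAGE=/flatcar/vmlinuz-a, mount.usr=/dev/mapper/usr, verity.usr=PARTUUID=..., root=LABEL=ROOT, plus the verity.usrhash digest) is what drives the dm-verity protected /usr mount later in this boot. As a minimal, purely illustrative sketch for post-processing a captured command line like this one (the standalone script and helper name are assumptions, not anything run during boot), it can be split into key/value pairs as follows:

# Minimal sketch (assumed post-processing helper, not part of the boot flow):
# split a kernel command line like the one logged above into a dict.
from typing import Dict

CMDLINE = (
    "BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr "
    "verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw "
    "mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 "
    "flatcar.first_boot=detected acpi=force "
    "verity.usrhash=93cb17f03b776356c0810b716fff0c7c2012572bbe395c702f6873d17674684f"
)

def parse_cmdline(cmdline: str) -> Dict[str, str]:
    """Split whitespace-separated kernel parameters; bare flags map to an empty string."""
    params: Dict[str, str] = {}
    for token in cmdline.split():
        key, sep, value = token.partition("=")  # split on the first '=' only
        params[key] = value if sep else ""
    return params

if __name__ == "__main__":
    params = parse_cmdline(CMDLINE)
    print(params["root"])        # LABEL=ROOT
    print(params["verity.usr"])  # PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132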
Mar 21 12:32:46.899732 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Mar 21 12:32:46.899738 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 21 12:32:46.899744 kernel: GICv3: 256 SPIs implemented Mar 21 12:32:46.899751 kernel: GICv3: 0 Extended SPIs implemented Mar 21 12:32:46.899757 kernel: Root IRQ handler: gic_handle_irq Mar 21 12:32:46.899763 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 21 12:32:46.899769 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Mar 21 12:32:46.899776 kernel: ITS [mem 0x08080000-0x0809ffff] Mar 21 12:32:46.899782 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400c0000 (indirect, esz 8, psz 64K, shr 1) Mar 21 12:32:46.899789 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400d0000 (flat, esz 8, psz 64K, shr 1) Mar 21 12:32:46.899796 kernel: GICv3: using LPI property table @0x00000000400f0000 Mar 21 12:32:46.899803 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040100000 Mar 21 12:32:46.899809 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 21 12:32:46.899815 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 21 12:32:46.899822 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 21 12:32:46.899828 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 21 12:32:46.899842 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 21 12:32:46.899848 kernel: arm-pv: using stolen time PV Mar 21 12:32:46.899855 kernel: Console: colour dummy device 80x25 Mar 21 12:32:46.899861 kernel: ACPI: Core revision 20230628 Mar 21 12:32:46.899868 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 21 12:32:46.899876 kernel: pid_max: default: 32768 minimum: 301 Mar 21 12:32:46.899883 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 21 12:32:46.899890 kernel: landlock: Up and running. Mar 21 12:32:46.899896 kernel: SELinux: Initializing. Mar 21 12:32:46.899902 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 21 12:32:46.899909 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 21 12:32:46.899915 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 21 12:32:46.899922 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 21 12:32:46.899929 kernel: rcu: Hierarchical SRCU implementation. Mar 21 12:32:46.899936 kernel: rcu: Max phase no-delay instances is 400. Mar 21 12:32:46.899943 kernel: Platform MSI: ITS@0x8080000 domain created Mar 21 12:32:46.899949 kernel: PCI/MSI: ITS@0x8080000 domain created Mar 21 12:32:46.899956 kernel: Remapping and enabling EFI services. Mar 21 12:32:46.899962 kernel: smp: Bringing up secondary CPUs ... 
Mar 21 12:32:46.899968 kernel: Detected PIPT I-cache on CPU1 Mar 21 12:32:46.899975 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Mar 21 12:32:46.899982 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040110000 Mar 21 12:32:46.899989 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 21 12:32:46.899996 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 21 12:32:46.900003 kernel: Detected PIPT I-cache on CPU2 Mar 21 12:32:46.900014 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Mar 21 12:32:46.900022 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040120000 Mar 21 12:32:46.900028 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 21 12:32:46.900035 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Mar 21 12:32:46.900042 kernel: Detected PIPT I-cache on CPU3 Mar 21 12:32:46.900048 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Mar 21 12:32:46.900056 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040130000 Mar 21 12:32:46.900063 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 21 12:32:46.900070 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Mar 21 12:32:46.900077 kernel: smp: Brought up 1 node, 4 CPUs Mar 21 12:32:46.900083 kernel: SMP: Total of 4 processors activated. Mar 21 12:32:46.900090 kernel: CPU features: detected: 32-bit EL0 Support Mar 21 12:32:46.900097 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 21 12:32:46.900104 kernel: CPU features: detected: Common not Private translations Mar 21 12:32:46.900111 kernel: CPU features: detected: CRC32 instructions Mar 21 12:32:46.900118 kernel: CPU features: detected: Enhanced Virtualization Traps Mar 21 12:32:46.900125 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 21 12:32:46.900132 kernel: CPU features: detected: LSE atomic instructions Mar 21 12:32:46.900139 kernel: CPU features: detected: Privileged Access Never Mar 21 12:32:46.900146 kernel: CPU features: detected: RAS Extension Support Mar 21 12:32:46.900152 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Mar 21 12:32:46.900159 kernel: CPU: All CPU(s) started at EL1 Mar 21 12:32:46.900166 kernel: alternatives: applying system-wide alternatives Mar 21 12:32:46.900172 kernel: devtmpfs: initialized Mar 21 12:32:46.900180 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 21 12:32:46.900188 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Mar 21 12:32:46.900195 kernel: pinctrl core: initialized pinctrl subsystem Mar 21 12:32:46.900201 kernel: SMBIOS 3.0.0 present. 
Mar 21 12:32:46.900208 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Mar 21 12:32:46.900220 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 21 12:32:46.900228 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 21 12:32:46.900235 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 21 12:32:46.900242 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 21 12:32:46.900250 kernel: audit: initializing netlink subsys (disabled) Mar 21 12:32:46.900257 kernel: audit: type=2000 audit(0.019:1): state=initialized audit_enabled=0 res=1 Mar 21 12:32:46.900264 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 21 12:32:46.900270 kernel: cpuidle: using governor menu Mar 21 12:32:46.900277 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 21 12:32:46.900284 kernel: ASID allocator initialised with 32768 entries Mar 21 12:32:46.900291 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 21 12:32:46.900297 kernel: Serial: AMBA PL011 UART driver Mar 21 12:32:46.900304 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Mar 21 12:32:46.900312 kernel: Modules: 0 pages in range for non-PLT usage Mar 21 12:32:46.900319 kernel: Modules: 509248 pages in range for PLT usage Mar 21 12:32:46.900326 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 21 12:32:46.900333 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 21 12:32:46.900340 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 21 12:32:46.900347 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 21 12:32:46.900353 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 21 12:32:46.900360 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 21 12:32:46.900367 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 21 12:32:46.900375 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 21 12:32:46.900381 kernel: ACPI: Added _OSI(Module Device) Mar 21 12:32:46.900388 kernel: ACPI: Added _OSI(Processor Device) Mar 21 12:32:46.900395 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 21 12:32:46.900402 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 21 12:32:46.900408 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 21 12:32:46.900415 kernel: ACPI: Interpreter enabled Mar 21 12:32:46.900422 kernel: ACPI: Using GIC for interrupt routing Mar 21 12:32:46.900429 kernel: ACPI: MCFG table detected, 1 entries Mar 21 12:32:46.900436 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Mar 21 12:32:46.900444 kernel: printk: console [ttyAMA0] enabled Mar 21 12:32:46.900451 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 21 12:32:46.900576 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 21 12:32:46.900647 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Mar 21 12:32:46.900710 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 21 12:32:46.900771 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Mar 21 12:32:46.900839 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Mar 21 12:32:46.900852 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Mar 21 12:32:46.900859 
kernel: PCI host bridge to bus 0000:00 Mar 21 12:32:46.900932 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Mar 21 12:32:46.900990 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 21 12:32:46.901046 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Mar 21 12:32:46.901102 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 21 12:32:46.901178 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 Mar 21 12:32:46.901286 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 Mar 21 12:32:46.901354 kernel: pci 0000:00:01.0: reg 0x10: [io 0x0000-0x001f] Mar 21 12:32:46.901418 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x10000000-0x10000fff] Mar 21 12:32:46.901480 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref] Mar 21 12:32:46.901544 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref] Mar 21 12:32:46.901606 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x10000000-0x10000fff] Mar 21 12:32:46.901672 kernel: pci 0000:00:01.0: BAR 0: assigned [io 0x1000-0x101f] Mar 21 12:32:46.901731 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 21 12:32:46.901787 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 21 12:32:46.901850 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 21 12:32:46.901859 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 21 12:32:46.901867 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 21 12:32:46.901873 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 21 12:32:46.901880 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 21 12:32:46.901889 kernel: iommu: Default domain type: Translated Mar 21 12:32:46.901896 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 21 12:32:46.901903 kernel: efivars: Registered efivars operations Mar 21 12:32:46.901910 kernel: vgaarb: loaded Mar 21 12:32:46.901917 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 21 12:32:46.901924 kernel: VFS: Disk quotas dquot_6.6.0 Mar 21 12:32:46.901931 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 21 12:32:46.901938 kernel: pnp: PnP ACPI init Mar 21 12:32:46.902010 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 21 12:32:46.902021 kernel: pnp: PnP ACPI: found 1 devices Mar 21 12:32:46.902029 kernel: NET: Registered PF_INET protocol family Mar 21 12:32:46.902035 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 21 12:32:46.902043 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 21 12:32:46.902050 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 21 12:32:46.902057 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 21 12:32:46.902063 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 21 12:32:46.902071 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 21 12:32:46.902079 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 21 12:32:46.902086 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 21 12:32:46.902093 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 21 12:32:46.902099 kernel: PCI: CLS 0 bytes, default 64 Mar 21 12:32:46.902106 kernel: kvm [1]: HYP mode not available 
Mar 21 12:32:46.902113 kernel: Initialise system trusted keyrings Mar 21 12:32:46.902120 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 21 12:32:46.902126 kernel: Key type asymmetric registered Mar 21 12:32:46.902133 kernel: Asymmetric key parser 'x509' registered Mar 21 12:32:46.902140 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 21 12:32:46.902148 kernel: io scheduler mq-deadline registered Mar 21 12:32:46.902155 kernel: io scheduler kyber registered Mar 21 12:32:46.902162 kernel: io scheduler bfq registered Mar 21 12:32:46.902169 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 21 12:32:46.902176 kernel: ACPI: button: Power Button [PWRB] Mar 21 12:32:46.902183 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 21 12:32:46.902256 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Mar 21 12:32:46.902266 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 21 12:32:46.902273 kernel: thunder_xcv, ver 1.0 Mar 21 12:32:46.902282 kernel: thunder_bgx, ver 1.0 Mar 21 12:32:46.902288 kernel: nicpf, ver 1.0 Mar 21 12:32:46.902295 kernel: nicvf, ver 1.0 Mar 21 12:32:46.902364 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 21 12:32:46.902424 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-21T12:32:46 UTC (1742560366) Mar 21 12:32:46.902434 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 21 12:32:46.902441 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Mar 21 12:32:46.902448 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 21 12:32:46.902456 kernel: watchdog: Hard watchdog permanently disabled Mar 21 12:32:46.902463 kernel: NET: Registered PF_INET6 protocol family Mar 21 12:32:46.902470 kernel: Segment Routing with IPv6 Mar 21 12:32:46.902477 kernel: In-situ OAM (IOAM) with IPv6 Mar 21 12:32:46.902484 kernel: NET: Registered PF_PACKET protocol family Mar 21 12:32:46.902490 kernel: Key type dns_resolver registered Mar 21 12:32:46.902497 kernel: registered taskstats version 1 Mar 21 12:32:46.902504 kernel: Loading compiled-in X.509 certificates Mar 21 12:32:46.902511 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 5eb113f0b3321dedaccf2566eff1e4f54032526e' Mar 21 12:32:46.902520 kernel: Key type .fscrypt registered Mar 21 12:32:46.902526 kernel: Key type fscrypt-provisioning registered Mar 21 12:32:46.902533 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 21 12:32:46.902540 kernel: ima: Allocated hash algorithm: sha1 Mar 21 12:32:46.902547 kernel: ima: No architecture policies found Mar 21 12:32:46.902554 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 21 12:32:46.902561 kernel: clk: Disabling unused clocks Mar 21 12:32:46.902567 kernel: Freeing unused kernel memory: 38464K Mar 21 12:32:46.902575 kernel: Run /init as init process Mar 21 12:32:46.902582 kernel: with arguments: Mar 21 12:32:46.902589 kernel: /init Mar 21 12:32:46.902595 kernel: with environment: Mar 21 12:32:46.902602 kernel: HOME=/ Mar 21 12:32:46.902608 kernel: TERM=linux Mar 21 12:32:46.902615 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 21 12:32:46.902623 systemd[1]: Successfully made /usr/ read-only. 
Mar 21 12:32:46.902632 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 21 12:32:46.902642 systemd[1]: Detected virtualization kvm. Mar 21 12:32:46.902649 systemd[1]: Detected architecture arm64. Mar 21 12:32:46.902656 systemd[1]: Running in initrd. Mar 21 12:32:46.902663 systemd[1]: No hostname configured, using default hostname. Mar 21 12:32:46.902671 systemd[1]: Hostname set to . Mar 21 12:32:46.902678 systemd[1]: Initializing machine ID from VM UUID. Mar 21 12:32:46.902685 systemd[1]: Queued start job for default target initrd.target. Mar 21 12:32:46.902694 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 21 12:32:46.902701 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 21 12:32:46.902709 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 21 12:32:46.902717 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 21 12:32:46.902724 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 21 12:32:46.902733 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 21 12:32:46.902741 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 21 12:32:46.902749 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 21 12:32:46.902757 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 21 12:32:46.902764 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 21 12:32:46.902772 systemd[1]: Reached target paths.target - Path Units. Mar 21 12:32:46.902779 systemd[1]: Reached target slices.target - Slice Units. Mar 21 12:32:46.902786 systemd[1]: Reached target swap.target - Swaps. Mar 21 12:32:46.902794 systemd[1]: Reached target timers.target - Timer Units. Mar 21 12:32:46.902801 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 21 12:32:46.902808 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 21 12:32:46.902817 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 21 12:32:46.902825 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 21 12:32:46.902838 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 21 12:32:46.902847 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 21 12:32:46.902854 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 21 12:32:46.902862 systemd[1]: Reached target sockets.target - Socket Units. Mar 21 12:32:46.902869 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 21 12:32:46.902876 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 21 12:32:46.902885 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 21 12:32:46.902893 systemd[1]: Starting systemd-fsck-usr.service... 
Mar 21 12:32:46.902900 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 21 12:32:46.902908 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 21 12:32:46.902915 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 21 12:32:46.902923 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 21 12:32:46.902930 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 21 12:32:46.902939 systemd[1]: Finished systemd-fsck-usr.service. Mar 21 12:32:46.902947 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 21 12:32:46.902955 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 21 12:32:46.902963 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 21 12:32:46.902986 systemd-journald[237]: Collecting audit messages is disabled. Mar 21 12:32:46.903006 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 21 12:32:46.903014 systemd-journald[237]: Journal started Mar 21 12:32:46.903032 systemd-journald[237]: Runtime Journal (/run/log/journal/bc92f46de7404d50a79ac2638a16ee21) is 5.9M, max 47.3M, 41.4M free. Mar 21 12:32:46.911314 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 21 12:32:46.911349 kernel: Bridge firewalling registered Mar 21 12:32:46.888758 systemd-modules-load[238]: Inserted module 'overlay' Mar 21 12:32:46.906415 systemd-modules-load[238]: Inserted module 'br_netfilter' Mar 21 12:32:46.914941 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 21 12:32:46.914958 systemd[1]: Started systemd-journald.service - Journal Service. Mar 21 12:32:46.915914 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 21 12:32:46.920716 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 21 12:32:46.923146 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 21 12:32:46.925349 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 21 12:32:46.926319 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 21 12:32:46.928819 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 21 12:32:46.932029 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 21 12:32:46.939488 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 21 12:32:46.941771 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 21 12:32:46.947925 dracut-cmdline[275]: dracut-dracut-053 Mar 21 12:32:46.957824 dracut-cmdline[275]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=93cb17f03b776356c0810b716fff0c7c2012572bbe395c702f6873d17674684f Mar 21 12:32:46.986336 systemd-resolved[281]: Positive Trust Anchors: Mar 21 12:32:46.986352 systemd-resolved[281]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 21 12:32:46.986383 systemd-resolved[281]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 21 12:32:46.992592 systemd-resolved[281]: Defaulting to hostname 'linux'. Mar 21 12:32:46.993515 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 21 12:32:46.994376 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 21 12:32:47.015243 kernel: SCSI subsystem initialized Mar 21 12:32:47.022240 kernel: Loading iSCSI transport class v2.0-870. Mar 21 12:32:47.030245 kernel: iscsi: registered transport (tcp) Mar 21 12:32:47.041239 kernel: iscsi: registered transport (qla4xxx) Mar 21 12:32:47.041257 kernel: QLogic iSCSI HBA Driver Mar 21 12:32:47.079389 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 21 12:32:47.082369 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 21 12:32:47.109469 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 21 12:32:47.109520 kernel: device-mapper: uevent: version 1.0.3 Mar 21 12:32:47.110286 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 21 12:32:47.156239 kernel: raid6: neonx8 gen() 15736 MB/s Mar 21 12:32:47.173229 kernel: raid6: neonx4 gen() 15763 MB/s Mar 21 12:32:47.190237 kernel: raid6: neonx2 gen() 13176 MB/s Mar 21 12:32:47.207227 kernel: raid6: neonx1 gen() 10466 MB/s Mar 21 12:32:47.224228 kernel: raid6: int64x8 gen() 6773 MB/s Mar 21 12:32:47.241226 kernel: raid6: int64x4 gen() 7328 MB/s Mar 21 12:32:47.258231 kernel: raid6: int64x2 gen() 6095 MB/s Mar 21 12:32:47.275231 kernel: raid6: int64x1 gen() 5037 MB/s Mar 21 12:32:47.275244 kernel: raid6: using algorithm neonx4 gen() 15763 MB/s Mar 21 12:32:47.292236 kernel: raid6: .... xor() 12349 MB/s, rmw enabled Mar 21 12:32:47.292249 kernel: raid6: using neon recovery algorithm Mar 21 12:32:47.297232 kernel: xor: measuring software checksum speed Mar 21 12:32:47.297246 kernel: 8regs : 21613 MB/sec Mar 21 12:32:47.297259 kernel: 32regs : 20083 MB/sec Mar 21 12:32:47.298559 kernel: arm64_neon : 27860 MB/sec Mar 21 12:32:47.298571 kernel: xor: using function: arm64_neon (27860 MB/sec) Mar 21 12:32:47.349238 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 21 12:32:47.359271 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 21 12:32:47.361634 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 21 12:32:47.386324 systemd-udevd[464]: Using default interface naming scheme 'v255'. Mar 21 12:32:47.389904 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 21 12:32:47.392337 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 21 12:32:47.413040 dracut-pre-trigger[473]: rd.md=0: removing MD RAID activation Mar 21 12:32:47.436660 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 21 12:32:47.440309 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 21 12:32:47.490239 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 21 12:32:47.492350 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 21 12:32:47.515326 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 21 12:32:47.516740 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 21 12:32:47.517981 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 21 12:32:47.519849 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 21 12:32:47.521949 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 21 12:32:47.536242 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Mar 21 12:32:47.552363 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Mar 21 12:32:47.552463 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 21 12:32:47.552475 kernel: GPT:9289727 != 19775487 Mar 21 12:32:47.552484 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 21 12:32:47.552492 kernel: GPT:9289727 != 19775487 Mar 21 12:32:47.552501 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 21 12:32:47.552511 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 21 12:32:47.544966 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 21 12:32:47.552369 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 21 12:32:47.552471 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 21 12:32:47.554165 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 21 12:32:47.555148 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 21 12:32:47.555282 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 21 12:32:47.558480 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 21 12:32:47.566435 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 21 12:32:47.572059 kernel: BTRFS: device fsid bdcda679-e2cc-43ec-88ed-d0a5c8807e76 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (527) Mar 21 12:32:47.572101 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (518) Mar 21 12:32:47.584737 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 21 12:32:47.586904 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 21 12:32:47.597050 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 21 12:32:47.615193 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 21 12:32:47.616343 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 21 12:32:47.626703 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 21 12:32:47.628582 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Mar 21 12:32:47.630499 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 21 12:32:47.650852 disk-uuid[555]: Primary Header is updated. Mar 21 12:32:47.650852 disk-uuid[555]: Secondary Entries is updated. Mar 21 12:32:47.650852 disk-uuid[555]: Secondary Header is updated. Mar 21 12:32:47.657243 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 21 12:32:47.658880 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 21 12:32:48.664240 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 21 12:32:48.664776 disk-uuid[556]: The operation has completed successfully. Mar 21 12:32:48.684746 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 21 12:32:48.684846 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 21 12:32:48.720681 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 21 12:32:48.734844 sh[576]: Success Mar 21 12:32:48.750236 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 21 12:32:48.779493 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 21 12:32:48.781671 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 21 12:32:48.793493 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 21 12:32:48.798730 kernel: BTRFS info (device dm-0): first mount of filesystem bdcda679-e2cc-43ec-88ed-d0a5c8807e76 Mar 21 12:32:48.798760 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 21 12:32:48.798770 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 21 12:32:48.798786 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 21 12:32:48.799306 kernel: BTRFS info (device dm-0): using free space tree Mar 21 12:32:48.802860 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 21 12:32:48.804170 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 21 12:32:48.804857 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 21 12:32:48.807663 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 21 12:32:48.825868 kernel: BTRFS info (device vda6): first mount of filesystem fea78075-4b56-496a-88c9-8f4cfa7493bf Mar 21 12:32:48.825901 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 21 12:32:48.825917 kernel: BTRFS info (device vda6): using free space tree Mar 21 12:32:48.829319 kernel: BTRFS info (device vda6): auto enabling async discard Mar 21 12:32:48.832238 kernel: BTRFS info (device vda6): last unmount of filesystem fea78075-4b56-496a-88c9-8f4cfa7493bf Mar 21 12:32:48.835848 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 21 12:32:48.837638 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 21 12:32:48.901947 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 21 12:32:48.905452 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Mar 21 12:32:48.930951 ignition[670]: Ignition 2.20.0 Mar 21 12:32:48.930961 ignition[670]: Stage: fetch-offline Mar 21 12:32:48.930990 ignition[670]: no configs at "/usr/lib/ignition/base.d" Mar 21 12:32:48.930998 ignition[670]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 21 12:32:48.931155 ignition[670]: parsed url from cmdline: "" Mar 21 12:32:48.931158 ignition[670]: no config URL provided Mar 21 12:32:48.931162 ignition[670]: reading system config file "/usr/lib/ignition/user.ign" Mar 21 12:32:48.931169 ignition[670]: no config at "/usr/lib/ignition/user.ign" Mar 21 12:32:48.931192 ignition[670]: op(1): [started] loading QEMU firmware config module Mar 21 12:32:48.931196 ignition[670]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 21 12:32:48.940015 systemd-networkd[763]: lo: Link UP Mar 21 12:32:48.940025 systemd-networkd[763]: lo: Gained carrier Mar 21 12:32:48.940867 systemd-networkd[763]: Enumeration completed Mar 21 12:32:48.941356 systemd-networkd[763]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 21 12:32:48.941360 systemd-networkd[763]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 21 12:32:48.943246 ignition[670]: op(1): [finished] loading QEMU firmware config module Mar 21 12:32:48.941823 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 21 12:32:48.941951 systemd-networkd[763]: eth0: Link UP Mar 21 12:32:48.941954 systemd-networkd[763]: eth0: Gained carrier Mar 21 12:32:48.941959 systemd-networkd[763]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 21 12:32:48.945798 systemd[1]: Reached target network.target - Network. Mar 21 12:32:48.962250 systemd-networkd[763]: eth0: DHCPv4 address 10.0.0.87/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 21 12:32:48.989921 ignition[670]: parsing config with SHA512: ef855bef57dfc3c7bb79e772bec1368fb1cb10d78e46caf235825e88bef55bc52ebf9e4df1469da330f81b22171ed688a20609a7db3f79195cc7dd61d5f2887f Mar 21 12:32:48.996167 unknown[670]: fetched base config from "system" Mar 21 12:32:48.996178 unknown[670]: fetched user config from "qemu" Mar 21 12:32:48.996608 ignition[670]: fetch-offline: fetch-offline passed Mar 21 12:32:48.998589 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 21 12:32:48.996683 ignition[670]: Ignition finished successfully Mar 21 12:32:48.999528 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 21 12:32:49.000191 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 21 12:32:49.029358 ignition[771]: Ignition 2.20.0 Mar 21 12:32:49.029367 ignition[771]: Stage: kargs Mar 21 12:32:49.029524 ignition[771]: no configs at "/usr/lib/ignition/base.d" Mar 21 12:32:49.029535 ignition[771]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 21 12:32:49.030355 ignition[771]: kargs: kargs passed Mar 21 12:32:49.030399 ignition[771]: Ignition finished successfully Mar 21 12:32:49.033278 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 21 12:32:49.035066 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Mar 21 12:32:49.065144 ignition[779]: Ignition 2.20.0 Mar 21 12:32:49.065154 ignition[779]: Stage: disks Mar 21 12:32:49.065321 ignition[779]: no configs at "/usr/lib/ignition/base.d" Mar 21 12:32:49.065331 ignition[779]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 21 12:32:49.066154 ignition[779]: disks: disks passed Mar 21 12:32:49.067426 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 21 12:32:49.066197 ignition[779]: Ignition finished successfully Mar 21 12:32:49.068674 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 21 12:32:49.069942 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 21 12:32:49.071656 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 21 12:32:49.073073 systemd[1]: Reached target sysinit.target - System Initialization. Mar 21 12:32:49.074705 systemd[1]: Reached target basic.target - Basic System. Mar 21 12:32:49.077125 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 21 12:32:49.104068 systemd-fsck[790]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 21 12:32:49.107484 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 21 12:32:49.110243 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 21 12:32:49.159241 kernel: EXT4-fs (vda9): mounted filesystem 3004295c-1fab-4723-a953-2dc6fc131037 r/w with ordered data mode. Quota mode: none. Mar 21 12:32:49.159938 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 21 12:32:49.161124 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 21 12:32:49.163268 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 21 12:32:49.164784 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 21 12:32:49.165800 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 21 12:32:49.165847 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 21 12:32:49.165872 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 21 12:32:49.174950 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 21 12:32:49.177868 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 21 12:32:49.180384 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (798) Mar 21 12:32:49.180404 kernel: BTRFS info (device vda6): first mount of filesystem fea78075-4b56-496a-88c9-8f4cfa7493bf Mar 21 12:32:49.180414 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 21 12:32:49.180423 kernel: BTRFS info (device vda6): using free space tree Mar 21 12:32:49.183234 kernel: BTRFS info (device vda6): auto enabling async discard Mar 21 12:32:49.184001 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 21 12:32:49.223157 initrd-setup-root[823]: cut: /sysroot/etc/passwd: No such file or directory Mar 21 12:32:49.226351 initrd-setup-root[830]: cut: /sysroot/etc/group: No such file or directory Mar 21 12:32:49.230078 initrd-setup-root[837]: cut: /sysroot/etc/shadow: No such file or directory Mar 21 12:32:49.233661 initrd-setup-root[844]: cut: /sysroot/etc/gshadow: No such file or directory Mar 21 12:32:49.304368 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Mar 21 12:32:49.306246 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 21 12:32:49.307532 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 21 12:32:49.336252 kernel: BTRFS info (device vda6): last unmount of filesystem fea78075-4b56-496a-88c9-8f4cfa7493bf Mar 21 12:32:49.350393 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 21 12:32:49.359511 ignition[914]: INFO : Ignition 2.20.0 Mar 21 12:32:49.359511 ignition[914]: INFO : Stage: mount Mar 21 12:32:49.360703 ignition[914]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 21 12:32:49.360703 ignition[914]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 21 12:32:49.360703 ignition[914]: INFO : mount: mount passed Mar 21 12:32:49.360703 ignition[914]: INFO : Ignition finished successfully Mar 21 12:32:49.363272 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 21 12:32:49.365012 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 21 12:32:49.922360 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 21 12:32:49.923854 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 21 12:32:49.941451 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (927) Mar 21 12:32:49.941482 kernel: BTRFS info (device vda6): first mount of filesystem fea78075-4b56-496a-88c9-8f4cfa7493bf Mar 21 12:32:49.941493 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 21 12:32:49.942608 kernel: BTRFS info (device vda6): using free space tree Mar 21 12:32:49.945238 kernel: BTRFS info (device vda6): auto enabling async discard Mar 21 12:32:49.945729 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 21 12:32:49.978066 ignition[944]: INFO : Ignition 2.20.0 Mar 21 12:32:49.978066 ignition[944]: INFO : Stage: files Mar 21 12:32:49.979257 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 21 12:32:49.979257 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 21 12:32:49.979257 ignition[944]: DEBUG : files: compiled without relabeling support, skipping Mar 21 12:32:49.981706 ignition[944]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 21 12:32:49.981706 ignition[944]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 21 12:32:49.983965 ignition[944]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 21 12:32:49.983965 ignition[944]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 21 12:32:49.986095 ignition[944]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 21 12:32:49.986095 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Mar 21 12:32:49.986095 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Mar 21 12:32:49.984080 unknown[944]: wrote ssh authorized keys file for user: core Mar 21 12:32:50.586403 systemd-networkd[763]: eth0: Gained IPv6LL Mar 21 12:32:51.005114 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 21 12:32:55.247968 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Mar 21 12:32:55.247968 ignition[944]: INFO : files: 
createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 21 12:32:55.251037 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-arm64.raw: attempt #1 Mar 21 12:32:55.615413 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 21 12:32:56.266415 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 21 12:32:56.266415 ignition[944]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(d): op(e): [finished] writing unit 
"coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Mar 21 12:32:56.269302 ignition[944]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Mar 21 12:32:56.285366 ignition[944]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Mar 21 12:32:56.288600 ignition[944]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Mar 21 12:32:56.289654 ignition[944]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Mar 21 12:32:56.289654 ignition[944]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Mar 21 12:32:56.289654 ignition[944]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Mar 21 12:32:56.289654 ignition[944]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 21 12:32:56.289654 ignition[944]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 21 12:32:56.289654 ignition[944]: INFO : files: files passed Mar 21 12:32:56.289654 ignition[944]: INFO : Ignition finished successfully Mar 21 12:32:56.290451 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 21 12:32:56.292556 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 21 12:32:56.294758 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 21 12:32:56.312442 initrd-setup-root-after-ignition[971]: grep: /sysroot/oem/oem-release: No such file or directory Mar 21 12:32:56.313661 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 21 12:32:56.313737 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 21 12:32:56.318011 initrd-setup-root-after-ignition[975]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 21 12:32:56.318011 initrd-setup-root-after-ignition[975]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 21 12:32:56.320283 initrd-setup-root-after-ignition[979]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 21 12:32:56.320071 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 21 12:32:56.321249 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 21 12:32:56.323508 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 21 12:32:56.357335 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 21 12:32:56.357444 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 21 12:32:56.359008 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 21 12:32:56.360519 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 21 12:32:56.361806 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 21 12:32:56.362461 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 21 12:32:56.387744 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Mar 21 12:32:56.389659 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 21 12:32:56.409402 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 21 12:32:56.410278 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 21 12:32:56.411779 systemd[1]: Stopped target timers.target - Timer Units. Mar 21 12:32:56.413017 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 21 12:32:56.413127 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 21 12:32:56.414895 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 21 12:32:56.416306 systemd[1]: Stopped target basic.target - Basic System. Mar 21 12:32:56.417510 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 21 12:32:56.418756 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 21 12:32:56.420109 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 21 12:32:56.421514 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 21 12:32:56.422881 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 21 12:32:56.424255 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 21 12:32:56.425674 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 21 12:32:56.426925 systemd[1]: Stopped target swap.target - Swaps. Mar 21 12:32:56.427988 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 21 12:32:56.428093 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 21 12:32:56.429859 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 21 12:32:56.431213 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 21 12:32:56.432602 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 21 12:32:56.433970 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 21 12:32:56.434919 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 21 12:32:56.435024 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 21 12:32:56.437795 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 21 12:32:56.437914 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 21 12:32:56.438872 systemd[1]: Stopped target paths.target - Path Units. Mar 21 12:32:56.440285 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 21 12:32:56.441143 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 21 12:32:56.442129 systemd[1]: Stopped target slices.target - Slice Units. Mar 21 12:32:56.443756 systemd[1]: Stopped target sockets.target - Socket Units. Mar 21 12:32:56.445111 systemd[1]: iscsid.socket: Deactivated successfully. Mar 21 12:32:56.445187 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 21 12:32:56.446781 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 21 12:32:56.446861 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 21 12:32:56.448826 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 21 12:32:56.448933 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. 
Mar 21 12:32:56.450382 systemd[1]: ignition-files.service: Deactivated successfully. Mar 21 12:32:56.450473 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 21 12:32:56.452502 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 21 12:32:56.453153 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 21 12:32:56.453290 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 21 12:32:56.455522 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 21 12:32:56.456177 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 21 12:32:56.456312 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 21 12:32:56.457943 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 21 12:32:56.458041 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 21 12:32:56.463676 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 21 12:32:56.464516 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 21 12:32:56.470859 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 21 12:32:56.475663 ignition[1000]: INFO : Ignition 2.20.0 Mar 21 12:32:56.475663 ignition[1000]: INFO : Stage: umount Mar 21 12:32:56.475663 ignition[1000]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 21 12:32:56.475663 ignition[1000]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 21 12:32:56.475663 ignition[1000]: INFO : umount: umount passed Mar 21 12:32:56.475663 ignition[1000]: INFO : Ignition finished successfully Mar 21 12:32:56.474957 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 21 12:32:56.475050 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 21 12:32:56.477595 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 21 12:32:56.477671 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 21 12:32:56.478993 systemd[1]: Stopped target network.target - Network. Mar 21 12:32:56.479871 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 21 12:32:56.479930 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 21 12:32:56.481180 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 21 12:32:56.481240 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 21 12:32:56.482665 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 21 12:32:56.482704 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 21 12:32:56.484261 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 21 12:32:56.484305 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 21 12:32:56.485643 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 21 12:32:56.485686 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 21 12:32:56.487483 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 21 12:32:56.488776 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 21 12:32:56.497279 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 21 12:32:56.498294 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 21 12:32:56.500926 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 21 12:32:56.501117 systemd[1]: systemd-networkd.service: Deactivated successfully. 
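The umount-stage lines above show Ignition checking its base-config locations ("/usr/lib/ignition/base.d" and the qemu platform directory) before running. A small sketch of that lookup, assuming the fragments would be plain .ign files merged in sorted order:

```python
from pathlib import Path

# Directories named in the log entries above.
BASE_DIRS = [
    Path("/usr/lib/ignition/base.d"),
    Path("/usr/lib/ignition/base.platform.d/qemu"),
]

def list_base_configs(dirs=BASE_DIRS):
    """Return any .ign fragments present, in the order they would be merged."""
    found = []
    for d in dirs:
        if not d.is_dir():
            print(f'no config dir at "{d}"')
            continue
        found.extend(sorted(d.glob("*.ign")))
    return found

if __name__ == "__main__":
    print(list_base_configs())
```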
Mar 21 12:32:56.501206 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 21 12:32:56.503887 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 21 12:32:56.504753 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 21 12:32:56.504815 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 21 12:32:56.507196 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 21 12:32:56.508809 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 21 12:32:56.508869 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 21 12:32:56.510541 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 21 12:32:56.510587 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 21 12:32:56.512842 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 21 12:32:56.512887 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 21 12:32:56.514540 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 21 12:32:56.514585 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 21 12:32:56.517043 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 21 12:32:56.519531 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 21 12:32:56.519589 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 21 12:32:56.522988 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 21 12:32:56.523097 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 21 12:32:56.525733 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 21 12:32:56.525782 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 21 12:32:56.527021 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 21 12:32:56.527054 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 21 12:32:56.528875 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 21 12:32:56.528923 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 21 12:32:56.531151 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 21 12:32:56.531199 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 21 12:32:56.533423 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 21 12:32:56.533471 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 21 12:32:56.536410 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 21 12:32:56.538003 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 21 12:32:56.538062 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 21 12:32:56.540688 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 21 12:32:56.540731 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 21 12:32:56.543874 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 21 12:32:56.543927 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 21 12:32:56.546533 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 21 12:32:56.546628 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 21 12:32:56.550753 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 21 12:32:56.550851 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 21 12:32:56.552707 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 21 12:32:56.554823 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 21 12:32:56.573489 systemd[1]: Switching root. Mar 21 12:32:56.601428 systemd-journald[237]: Journal stopped Mar 21 12:32:57.376015 systemd-journald[237]: Received SIGTERM from PID 1 (systemd). Mar 21 12:32:57.376081 kernel: SELinux: policy capability network_peer_controls=1 Mar 21 12:32:57.376096 kernel: SELinux: policy capability open_perms=1 Mar 21 12:32:57.376114 kernel: SELinux: policy capability extended_socket_class=1 Mar 21 12:32:57.376124 kernel: SELinux: policy capability always_check_network=0 Mar 21 12:32:57.376133 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 21 12:32:57.376144 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 21 12:32:57.376158 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 21 12:32:57.376167 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 21 12:32:57.376177 kernel: audit: type=1403 audit(1742560376.799:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 21 12:32:57.376188 systemd[1]: Successfully loaded SELinux policy in 29.964ms. Mar 21 12:32:57.376205 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.359ms. Mar 21 12:32:57.376229 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 21 12:32:57.376241 systemd[1]: Detected virtualization kvm. Mar 21 12:32:57.376252 systemd[1]: Detected architecture arm64. Mar 21 12:32:57.376262 systemd[1]: Detected first boot. Mar 21 12:32:57.376272 systemd[1]: Initializing machine ID from VM UUID. Mar 21 12:32:57.376283 zram_generator::config[1047]: No configuration found. Mar 21 12:32:57.376294 kernel: NET: Registered PF_VSOCK protocol family Mar 21 12:32:57.376304 systemd[1]: Populated /etc with preset unit settings. Mar 21 12:32:57.376318 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 21 12:32:57.376328 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 21 12:32:57.376339 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 21 12:32:57.376351 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 21 12:32:57.376362 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 21 12:32:57.376374 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 21 12:32:57.376385 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 21 12:32:57.376414 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 21 12:32:57.376425 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 21 12:32:57.376439 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
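After switch-root, systemd reports the facts it detected: KVM virtualization, arm64, a first boot, and a machine ID initialized from the VM UUID, with the SELinux policy loaded beforehand. A hedged sketch of collecting similar facts from userspace, using systemd-detect-virt and a simplified first-boot check (a missing or empty /etc/machine-id):

```python
import platform
import subprocess
from pathlib import Path

def boot_summary():
    """Collect roughly the facts systemd logs right after switch-root."""
    # systemd-detect-virt prints the virtualization technology, e.g. "kvm";
    # it must be available in PATH for this sketch to run.
    virt = subprocess.run(
        ["systemd-detect-virt"], capture_output=True, text=True
    ).stdout.strip()
    machine_id_file = Path("/etc/machine-id")
    machine_id = machine_id_file.read_text().strip() if machine_id_file.exists() else ""
    selinux = Path("/sys/fs/selinux/enforce")
    return {
        "virtualization": virt or "none",
        "architecture": platform.machine(),        # "aarch64" on this machine
        "first_boot": not machine_id,              # simplified first-boot heuristic
        "selinux_enforcing": selinux.read_text().strip() == "1" if selinux.exists() else None,
    }

if __name__ == "__main__":
    print(boot_summary())
```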
Mar 21 12:32:57.376450 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 21 12:32:57.376461 systemd[1]: Created slice user.slice - User and Session Slice. Mar 21 12:32:57.376471 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 21 12:32:57.376482 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 21 12:32:57.376493 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 21 12:32:57.376504 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 21 12:32:57.376515 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 21 12:32:57.376527 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 21 12:32:57.376538 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 21 12:32:57.376549 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 21 12:32:57.376560 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 21 12:32:57.376571 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 21 12:32:57.376582 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 21 12:32:57.376592 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 21 12:32:57.376605 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 21 12:32:57.376617 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 21 12:32:57.376627 systemd[1]: Reached target slices.target - Slice Units. Mar 21 12:32:57.376638 systemd[1]: Reached target swap.target - Swaps. Mar 21 12:32:57.376649 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 21 12:32:57.376659 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 21 12:32:57.376670 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 21 12:32:57.376681 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 21 12:32:57.376692 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 21 12:32:57.376702 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 21 12:32:57.376712 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 21 12:32:57.376725 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 21 12:32:57.376735 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 21 12:32:57.376746 systemd[1]: Mounting media.mount - External Media Directory... Mar 21 12:32:57.376757 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 21 12:32:57.376767 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 21 12:32:57.376778 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 21 12:32:57.376789 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 21 12:32:57.376805 systemd[1]: Reached target machines.target - Containers. Mar 21 12:32:57.376818 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Mar 21 12:32:57.376829 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 21 12:32:57.376841 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 21 12:32:57.376852 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 21 12:32:57.376862 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 21 12:32:57.376873 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 21 12:32:57.376884 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 21 12:32:57.376894 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 21 12:32:57.376906 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 21 12:32:57.376917 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 21 12:32:57.376928 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 21 12:32:57.376939 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 21 12:32:57.376950 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 21 12:32:57.376960 systemd[1]: Stopped systemd-fsck-usr.service. Mar 21 12:32:57.376971 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 21 12:32:57.376981 kernel: fuse: init (API version 7.39) Mar 21 12:32:57.376991 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 21 12:32:57.377003 kernel: loop: module loaded Mar 21 12:32:57.377013 kernel: ACPI: bus type drm_connector registered Mar 21 12:32:57.377023 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 21 12:32:57.377033 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 21 12:32:57.377044 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 21 12:32:57.377055 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 21 12:32:57.377066 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 21 12:32:57.377098 systemd-journald[1123]: Collecting audit messages is disabled. Mar 21 12:32:57.377124 systemd-journald[1123]: Journal started Mar 21 12:32:57.377145 systemd-journald[1123]: Runtime Journal (/run/log/journal/bc92f46de7404d50a79ac2638a16ee21) is 5.9M, max 47.3M, 41.4M free. Mar 21 12:32:57.181256 systemd[1]: Queued start job for default target multi-user.target. Mar 21 12:32:57.198259 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 21 12:32:57.198665 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 21 12:32:57.378464 systemd[1]: verity-setup.service: Deactivated successfully. Mar 21 12:32:57.378491 systemd[1]: Stopped verity-setup.service. Mar 21 12:32:57.383500 systemd[1]: Started systemd-journald.service - Journal Service. Mar 21 12:32:57.384182 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 21 12:32:57.385397 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 21 12:32:57.386650 systemd[1]: Mounted media.mount - External Media Directory. 
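Each modprobe@<module>.service instance started above simply loads its instance name as a kernel module. A minimal stand-in for that behaviour (the exact modprobe flags the unit template uses are not shown in the log, so plain modprobe is assumed; loading modules requires root):

```python
import subprocess

# Modules corresponding to the modprobe@<name>.service instances in the log.
MODULES = ["configfs", "dm_mod", "drm", "efi_pstore", "fuse", "loop"]

def load_modules(modules=MODULES):
    """Roughly what each template instance does: run modprobe on its instance name."""
    for mod in modules:
        result = subprocess.run(["modprobe", mod], capture_output=True, text=True)
        status = "ok" if result.returncode == 0 else f"failed: {result.stderr.strip()}"
        print(f"modprobe {mod}: {status}")

if __name__ == "__main__":
    load_modules()
```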
Mar 21 12:32:57.387829 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 21 12:32:57.389077 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 21 12:32:57.390299 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 21 12:32:57.391526 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 21 12:32:57.392991 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 21 12:32:57.394492 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 21 12:32:57.394669 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 21 12:32:57.396101 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 21 12:32:57.396295 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 21 12:32:57.397929 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 21 12:32:57.398090 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 21 12:32:57.401550 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 21 12:32:57.401715 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 21 12:32:57.403169 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 21 12:32:57.403359 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 21 12:32:57.404665 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 21 12:32:57.404850 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 21 12:32:57.406205 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 21 12:32:57.407720 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 21 12:32:57.409212 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 21 12:32:57.411762 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 21 12:32:57.423759 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 21 12:32:57.426348 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 21 12:32:57.428574 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 21 12:32:57.429720 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 21 12:32:57.429775 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 21 12:32:57.431841 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 21 12:32:57.441090 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 21 12:32:57.443163 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 21 12:32:57.444286 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 21 12:32:57.445577 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 21 12:32:57.447426 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 21 12:32:57.448627 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 21 12:32:57.452342 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Mar 21 12:32:57.453171 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 21 12:32:57.454486 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 21 12:32:57.456768 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 21 12:32:57.457440 systemd-journald[1123]: Time spent on flushing to /var/log/journal/bc92f46de7404d50a79ac2638a16ee21 is 14.005ms for 867 entries. Mar 21 12:32:57.457440 systemd-journald[1123]: System Journal (/var/log/journal/bc92f46de7404d50a79ac2638a16ee21) is 8M, max 195.6M, 187.6M free. Mar 21 12:32:57.480971 systemd-journald[1123]: Received client request to flush runtime journal. Mar 21 12:32:57.481010 kernel: loop0: detected capacity change from 0 to 103832 Mar 21 12:32:57.459453 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 21 12:32:57.465305 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 21 12:32:57.466598 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 21 12:32:57.468598 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 21 12:32:57.470412 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 21 12:32:57.476453 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 21 12:32:57.482044 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 21 12:32:57.483503 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 21 12:32:57.489314 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 21 12:32:57.493298 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 21 12:32:57.495563 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 21 12:32:57.499708 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 21 12:32:57.504014 udevadm[1173]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 21 12:32:57.507411 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 21 12:32:57.513513 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 21 12:32:57.519249 kernel: loop1: detected capacity change from 0 to 201592 Mar 21 12:32:57.527792 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 21 12:32:57.541873 systemd-tmpfiles[1185]: ACLs are not supported, ignoring. Mar 21 12:32:57.541890 systemd-tmpfiles[1185]: ACLs are not supported, ignoring. Mar 21 12:32:57.547012 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 21 12:32:57.554278 kernel: loop2: detected capacity change from 0 to 126448 Mar 21 12:32:57.592238 kernel: loop3: detected capacity change from 0 to 103832 Mar 21 12:32:57.598236 kernel: loop4: detected capacity change from 0 to 201592 Mar 21 12:32:57.604240 kernel: loop5: detected capacity change from 0 to 126448 Mar 21 12:32:57.607530 (sd-merge)[1191]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Mar 21 12:32:57.607957 (sd-merge)[1191]: Merged extensions into '/usr'. Mar 21 12:32:57.612551 systemd[1]: Reload requested from client PID 1166 ('systemd-sysext') (unit systemd-sysext.service)... 
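The (sd-merge) lines show systemd-sysext picking up the containerd-flatcar, docker-flatcar and kubernetes extensions and merging them into /usr, which is why a daemon reload is requested immediately afterwards. A sketch of the discovery step, assuming a few common extension directories (only /etc/extensions is actually confirmed by the Ignition link written earlier):

```python
from pathlib import Path

# Candidate directories for sysext images; the full search path is an
# assumption trimmed to common locations.
SYSEXT_DIRS = [
    Path("/etc/extensions"),
    Path("/run/extensions"),
    Path("/var/lib/extensions"),
]

def discover_extensions(dirs=SYSEXT_DIRS):
    """List candidate extension images (raw images or plain directories)."""
    images = []
    for d in dirs:
        if not d.is_dir():
            continue
        for entry in sorted(d.iterdir()):
            if entry.suffix == ".raw" or entry.is_dir():
                images.append(entry)
    return images

if __name__ == "__main__":
    for image in discover_extensions():
        print(image)
```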
Mar 21 12:32:57.612570 systemd[1]: Reloading... Mar 21 12:32:57.682242 zram_generator::config[1222]: No configuration found. Mar 21 12:32:57.738973 ldconfig[1161]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 21 12:32:57.776416 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 12:32:57.825666 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 21 12:32:57.826010 systemd[1]: Reloading finished in 213 ms. Mar 21 12:32:57.849003 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 21 12:32:57.850502 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 21 12:32:57.861653 systemd[1]: Starting ensure-sysext.service... Mar 21 12:32:57.863613 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 21 12:32:57.875590 systemd[1]: Reload requested from client PID 1253 ('systemctl') (unit ensure-sysext.service)... Mar 21 12:32:57.875607 systemd[1]: Reloading... Mar 21 12:32:57.881955 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 21 12:32:57.882523 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 21 12:32:57.883277 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 21 12:32:57.883589 systemd-tmpfiles[1254]: ACLs are not supported, ignoring. Mar 21 12:32:57.883709 systemd-tmpfiles[1254]: ACLs are not supported, ignoring. Mar 21 12:32:57.886368 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot. Mar 21 12:32:57.886481 systemd-tmpfiles[1254]: Skipping /boot Mar 21 12:32:57.895767 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot. Mar 21 12:32:57.895919 systemd-tmpfiles[1254]: Skipping /boot Mar 21 12:32:57.938302 zram_generator::config[1286]: No configuration found. Mar 21 12:32:58.020063 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 12:32:58.069813 systemd[1]: Reloading finished in 193 ms. Mar 21 12:32:58.082274 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 21 12:32:58.087953 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 21 12:32:58.108140 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 21 12:32:58.110508 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 21 12:32:58.112559 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 21 12:32:58.116985 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 21 12:32:58.120810 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 21 12:32:58.123459 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 21 12:32:58.142656 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
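The systemd-tmpfiles warnings above ("Duplicate line for path ... ignoring") come from the same path being declared by more than one line in the merged tmpfiles.d configuration. A small illustration of how such duplicates could be detected, assuming the usual tmpfiles.d search directories and the simple whitespace-separated line format:

```python
from collections import defaultdict
from pathlib import Path

# tmpfiles.d fragments are read from several directories and merged.
TMPFILES_DIRS = [Path("/etc/tmpfiles.d"), Path("/run/tmpfiles.d"), Path("/usr/lib/tmpfiles.d")]

def find_duplicate_paths(dirs=TMPFILES_DIRS):
    """Map each configured path to the fragments that mention it, keeping only duplicates."""
    seen = defaultdict(list)
    for d in dirs:
        if not d.is_dir():
            continue
        for frag in sorted(d.glob("*.conf")):
            for line in frag.read_text().splitlines():
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                fields = line.split()
                if len(fields) >= 2:
                    seen[fields[1]].append(frag.name)  # field 1 is the path column
    return {path: frags for path, frags in seen.items() if len(frags) > 1}

if __name__ == "__main__":
    for path, frags in find_duplicate_paths().items():
        print(f"{path}: defined in {', '.join(frags)}")
```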
Mar 21 12:32:58.144209 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 21 12:32:58.147824 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 21 12:32:58.151367 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 21 12:32:58.152923 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 21 12:32:58.153050 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 21 12:32:58.163507 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 21 12:32:58.165873 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 21 12:32:58.167639 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 21 12:32:58.169481 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 21 12:32:58.171265 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 21 12:32:58.172922 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 21 12:32:58.173101 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 21 12:32:58.174985 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 21 12:32:58.175155 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 21 12:32:58.177536 systemd-udevd[1324]: Using default interface naming scheme 'v255'. Mar 21 12:32:58.183895 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 21 12:32:58.185338 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 21 12:32:58.191469 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 21 12:32:58.202533 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 21 12:32:58.203381 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 21 12:32:58.203578 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 21 12:32:58.204970 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 21 12:32:58.205770 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 21 12:32:58.208864 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 21 12:32:58.211180 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 21 12:32:58.214788 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 21 12:32:58.216311 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 21 12:32:58.216494 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 21 12:32:58.217757 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 21 12:32:58.219725 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Mar 21 12:32:58.219955 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 21 12:32:58.225353 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 21 12:32:58.225540 augenrules[1377]: No rules Mar 21 12:32:58.227071 systemd[1]: audit-rules.service: Deactivated successfully. Mar 21 12:32:58.228383 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 21 12:32:58.250463 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 21 12:32:58.251769 systemd[1]: Finished ensure-sysext.service. Mar 21 12:32:58.256184 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 21 12:32:58.257898 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 21 12:32:58.259541 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 21 12:32:58.261540 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 21 12:32:58.265191 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 21 12:32:58.269173 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 21 12:32:58.275016 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 21 12:32:58.277460 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 21 12:32:58.277506 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 21 12:32:58.283382 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 21 12:32:58.286482 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 21 12:32:58.287406 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 21 12:32:58.288050 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 21 12:32:58.288255 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 21 12:32:58.289677 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 21 12:32:58.290391 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 21 12:32:58.291593 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 21 12:32:58.291747 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 21 12:32:58.296025 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 21 12:32:58.296367 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 21 12:32:58.302390 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 21 12:32:58.302457 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 21 12:32:58.310424 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1371) Mar 21 12:32:58.313387 augenrules[1394]: /sbin/augenrules: No change Mar 21 12:32:58.320010 systemd-resolved[1322]: Positive Trust Anchors: Mar 21 12:32:58.320028 systemd-resolved[1322]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 21 12:32:58.320059 systemd-resolved[1322]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 21 12:32:58.335454 augenrules[1426]: No rules Mar 21 12:32:58.336899 systemd[1]: audit-rules.service: Deactivated successfully. Mar 21 12:32:58.337128 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 21 12:32:58.338625 systemd-resolved[1322]: Defaulting to hostname 'linux'. Mar 21 12:32:58.344541 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 21 12:32:58.345519 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 21 12:32:58.353463 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 21 12:32:58.360325 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 21 12:32:58.368502 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 21 12:32:58.369769 systemd[1]: Reached target time-set.target - System Time Set. Mar 21 12:32:58.384652 systemd-networkd[1399]: lo: Link UP Mar 21 12:32:58.384663 systemd-networkd[1399]: lo: Gained carrier Mar 21 12:32:58.388269 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 21 12:32:58.390362 systemd-networkd[1399]: Enumeration completed Mar 21 12:32:58.392633 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 21 12:32:58.393663 systemd[1]: Reached target network.target - Network. Mar 21 12:32:58.395720 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 21 12:32:58.399935 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 21 12:32:58.403396 systemd-networkd[1399]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 21 12:32:58.403407 systemd-networkd[1399]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 21 12:32:58.403949 systemd-networkd[1399]: eth0: Link UP Mar 21 12:32:58.403953 systemd-networkd[1399]: eth0: Gained carrier Mar 21 12:32:58.403968 systemd-networkd[1399]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 21 12:32:58.405759 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 21 12:32:58.418862 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 21 12:32:58.421298 systemd-networkd[1399]: eth0: DHCPv4 address 10.0.0.87/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 21 12:32:58.421938 systemd-timesyncd[1402]: Network configuration changed, trying to establish connection. Mar 21 12:32:58.424049 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... 
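The resolved lines above list the positive DNSSEC trust anchor for the root zone (the DS record with key tag 20326) and a set of negative trust anchors: private or locally served zones for which validation is not insisted upon. A sketch of the suffix check involved, using a subset of the listed domains:

```python
# A few of the negative trust anchor domains listed above.
NEGATIVE_ANCHORS = [
    "home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa",
    "corp", "home", "internal", "intranet", "lan", "local", "private", "test",
]

def under_negative_anchor(name: str, anchors=NEGATIVE_ANCHORS) -> bool:
    """Return True if `name` equals or is a subdomain of any negative anchor."""
    labels = name.rstrip(".").lower()
    return any(labels == a or labels.endswith("." + a) for a in anchors)

if __name__ == "__main__":
    for name in ("printer.lan", "host.10.in-addr.arpa", "example.org"):
        print(name, under_negative_anchor(name))   # True, True, False
```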
Mar 21 12:32:57.957898 systemd-resolved[1322]: Clock change detected. Flushing caches. Mar 21 12:32:57.963246 systemd-journald[1123]: Time jumped backwards, rotating. Mar 21 12:32:57.958171 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 21 12:32:57.958219 systemd-timesyncd[1402]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 21 12:32:57.958273 systemd-timesyncd[1402]: Initial clock synchronization to Fri 2025-03-21 12:32:57.957824 UTC. Mar 21 12:32:57.982955 lvm[1445]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 21 12:32:57.996347 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 21 12:32:58.019546 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 21 12:32:58.020721 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 21 12:32:58.021574 systemd[1]: Reached target sysinit.target - System Initialization. Mar 21 12:32:58.022622 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 21 12:32:58.023564 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 21 12:32:58.024624 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 21 12:32:58.025483 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 21 12:32:58.026606 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 21 12:32:58.027506 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 21 12:32:58.027536 systemd[1]: Reached target paths.target - Path Units. Mar 21 12:32:58.028183 systemd[1]: Reached target timers.target - Timer Units. Mar 21 12:32:58.029801 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 21 12:32:58.032243 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 21 12:32:58.035207 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 21 12:32:58.036279 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 21 12:32:58.037179 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 21 12:32:58.041153 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 21 12:32:58.042328 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 21 12:32:58.044409 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 21 12:32:58.045777 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 21 12:32:58.046670 systemd[1]: Reached target sockets.target - Socket Units. Mar 21 12:32:58.047424 systemd[1]: Reached target basic.target - Basic System. Mar 21 12:32:58.048119 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 21 12:32:58.048148 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 21 12:32:58.049170 systemd[1]: Starting containerd.service - containerd container runtime... Mar 21 12:32:58.050909 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 21 12:32:58.052134 lvm[1455]: WARNING: Failed to connect to lvmetad. 
Falling back to device scanning. Mar 21 12:32:58.052823 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 21 12:32:58.058183 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 21 12:32:58.059581 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 21 12:32:58.063375 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 21 12:32:58.067665 jq[1458]: false Mar 21 12:32:58.065567 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 21 12:32:58.073136 dbus-daemon[1457]: [system] SELinux support is enabled Mar 21 12:32:58.076445 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 21 12:32:58.079006 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 21 12:32:58.082805 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 21 12:32:58.084285 extend-filesystems[1459]: Found loop3 Mar 21 12:32:58.085316 extend-filesystems[1459]: Found loop4 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found loop5 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda1 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda2 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda3 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found usr Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda4 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda6 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda7 Mar 21 12:32:58.086342 extend-filesystems[1459]: Found vda9 Mar 21 12:32:58.086342 extend-filesystems[1459]: Checking size of /dev/vda9 Mar 21 12:32:58.085830 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 21 12:32:58.086356 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 21 12:32:58.087692 systemd[1]: Starting update-engine.service - Update Engine... Mar 21 12:32:58.093017 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 21 12:32:58.096025 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 21 12:32:58.100226 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 21 12:32:58.105348 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 21 12:32:58.105577 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 21 12:32:58.105844 systemd[1]: motdgen.service: Deactivated successfully. Mar 21 12:32:58.109303 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 21 12:32:58.113777 extend-filesystems[1459]: Resized partition /dev/vda9 Mar 21 12:32:58.118557 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
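The extend-filesystems output above enumerates the loop and vda block devices before checking the size of /dev/vda9. One way such a scan can be done from userspace is via sysfs, where each block device exposes its size in 512-byte sectors:

```python
from pathlib import Path

def list_block_devices():
    """Enumerate block devices via sysfs, similar in spirit to the 'Found vdaN' scan."""
    devices = []
    for dev in sorted(Path("/sys/class/block").iterdir()):
        # The 'size' attribute is given in 512-byte sectors.
        sectors = int((dev / "size").read_text())
        devices.append((dev.name, sectors * 512))
    return devices

if __name__ == "__main__":
    for name, size_bytes in list_block_devices():
        print(f"Found {name}: {size_bytes / (1024 ** 3):.2f} GiB")
```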
Mar 21 12:32:58.120357 extend-filesystems[1481]: resize2fs 1.47.2 (1-Jan-2025) Mar 21 12:32:58.125132 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1378) Mar 21 12:32:58.125181 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 21 12:32:58.125194 jq[1475]: true Mar 21 12:32:58.118746 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 21 12:32:58.144754 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 21 12:32:58.144802 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 21 12:32:58.146352 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 21 12:32:58.146376 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 21 12:32:58.148946 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 21 12:32:58.153462 update_engine[1469]: I20250321 12:32:58.153312 1469 main.cc:92] Flatcar Update Engine starting Mar 21 12:32:58.162756 update_engine[1469]: I20250321 12:32:58.159312 1469 update_check_scheduler.cc:74] Next update check in 5m27s Mar 21 12:32:58.158263 (ntainerd)[1483]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 21 12:32:58.163066 tar[1482]: linux-arm64/LICENSE Mar 21 12:32:58.163066 tar[1482]: linux-arm64/helm Mar 21 12:32:58.160950 systemd[1]: Started update-engine.service - Update Engine. Mar 21 12:32:58.163265 jq[1484]: true Mar 21 12:32:58.165060 extend-filesystems[1481]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 21 12:32:58.165060 extend-filesystems[1481]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 21 12:32:58.165060 extend-filesystems[1481]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 21 12:32:58.172165 extend-filesystems[1459]: Resized filesystem in /dev/vda9 Mar 21 12:32:58.173327 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 21 12:32:58.173534 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 21 12:32:58.176658 systemd-logind[1467]: Watching system buttons on /dev/input/event0 (Power Button) Mar 21 12:32:58.177251 systemd-logind[1467]: New seat seat0. Mar 21 12:32:58.180395 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 21 12:32:58.181664 systemd[1]: Started systemd-logind.service - User Login Management. Mar 21 12:32:58.228075 bash[1513]: Updated "/home/core/.ssh/authorized_keys" Mar 21 12:32:58.232966 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 21 12:32:58.234875 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
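The resize2fs lines report the root filesystem growing in place from 553472 to 1864699 blocks of 4 KiB. In bytes, that works out as follows:

```python
# Block counts taken from the resize2fs / EXT4-fs messages above; 4 KiB blocks.
BLOCK_SIZE = 4096
OLD_BLOCKS = 553_472
NEW_BLOCKS = 1_864_699

old_bytes = OLD_BLOCKS * BLOCK_SIZE     # 2,267,021,312 bytes
new_bytes = NEW_BLOCKS * BLOCK_SIZE     # 7,637,807,104 bytes

print(f"before: {old_bytes / 1024 ** 3:.2f} GiB")   # ~2.11 GiB
print(f"after:  {new_bytes / 1024 ** 3:.2f} GiB")   # ~7.11 GiB
print(f"growth: {(new_bytes - old_bytes) / 1024 ** 3:.2f} GiB")
```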
Mar 21 12:32:58.258180 locksmithd[1499]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 21 12:32:58.376558 containerd[1483]: time="2025-03-21T12:32:58Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 21 12:32:58.378084 containerd[1483]: time="2025-03-21T12:32:58.377895174Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 21 12:32:58.391019 containerd[1483]: time="2025-03-21T12:32:58.390967054Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.68µs" Mar 21 12:32:58.391079 containerd[1483]: time="2025-03-21T12:32:58.391019454Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 21 12:32:58.391079 containerd[1483]: time="2025-03-21T12:32:58.391046254Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 21 12:32:58.391849 containerd[1483]: time="2025-03-21T12:32:58.391814614Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 21 12:32:58.391947 containerd[1483]: time="2025-03-21T12:32:58.391858014Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 21 12:32:58.391987 containerd[1483]: time="2025-03-21T12:32:58.391970974Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392061 containerd[1483]: time="2025-03-21T12:32:58.392042254Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392098 containerd[1483]: time="2025-03-21T12:32:58.392061054Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392513 containerd[1483]: time="2025-03-21T12:32:58.392479734Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392538 containerd[1483]: time="2025-03-21T12:32:58.392511854Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392538 containerd[1483]: time="2025-03-21T12:32:58.392526094Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392538 containerd[1483]: time="2025-03-21T12:32:58.392535054Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392702 containerd[1483]: time="2025-03-21T12:32:58.392680574Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 21 12:32:58.392987 containerd[1483]: time="2025-03-21T12:32:58.392965134Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 21 12:32:58.393020 containerd[1483]: time="2025-03-21T12:32:58.393004694Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 21 12:32:58.393052 containerd[1483]: time="2025-03-21T12:32:58.393018814Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 21 12:32:58.394481 containerd[1483]: time="2025-03-21T12:32:58.394451974Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 21 12:32:58.394884 containerd[1483]: time="2025-03-21T12:32:58.394859494Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 21 12:32:58.395034 containerd[1483]: time="2025-03-21T12:32:58.395013934Z" level=info msg="metadata content store policy set" policy=shared Mar 21 12:32:58.398612 containerd[1483]: time="2025-03-21T12:32:58.398573854Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 21 12:32:58.398680 containerd[1483]: time="2025-03-21T12:32:58.398631014Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 21 12:32:58.398680 containerd[1483]: time="2025-03-21T12:32:58.398659854Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 21 12:32:58.398680 containerd[1483]: time="2025-03-21T12:32:58.398672814Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 21 12:32:58.398734 containerd[1483]: time="2025-03-21T12:32:58.398685694Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 21 12:32:58.398797 containerd[1483]: time="2025-03-21T12:32:58.398774534Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 21 12:32:58.398821 containerd[1483]: time="2025-03-21T12:32:58.398800534Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 21 12:32:58.398821 containerd[1483]: time="2025-03-21T12:32:58.398814854Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 21 12:32:58.398861 containerd[1483]: time="2025-03-21T12:32:58.398829934Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 21 12:32:58.398861 containerd[1483]: time="2025-03-21T12:32:58.398841414Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 21 12:32:58.398861 containerd[1483]: time="2025-03-21T12:32:58.398853654Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 21 12:32:58.398909 containerd[1483]: time="2025-03-21T12:32:58.398866374Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 21 12:32:58.399039 containerd[1483]: time="2025-03-21T12:32:58.399017934Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 21 12:32:58.399063 containerd[1483]: time="2025-03-21T12:32:58.399047294Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 21 12:32:58.399082 containerd[1483]: time="2025-03-21T12:32:58.399062054Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 21 
12:32:58.399112 containerd[1483]: time="2025-03-21T12:32:58.399081774Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 21 12:32:58.399112 containerd[1483]: time="2025-03-21T12:32:58.399093814Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 21 12:32:58.399112 containerd[1483]: time="2025-03-21T12:32:58.399105614Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 21 12:32:58.399164 containerd[1483]: time="2025-03-21T12:32:58.399117334Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 21 12:32:58.399164 containerd[1483]: time="2025-03-21T12:32:58.399128974Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 21 12:32:58.399164 containerd[1483]: time="2025-03-21T12:32:58.399141494Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 21 12:32:58.399164 containerd[1483]: time="2025-03-21T12:32:58.399153774Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 21 12:32:58.399271 containerd[1483]: time="2025-03-21T12:32:58.399167214Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 21 12:32:58.399447 containerd[1483]: time="2025-03-21T12:32:58.399428174Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 21 12:32:58.399471 containerd[1483]: time="2025-03-21T12:32:58.399449934Z" level=info msg="Start snapshots syncer" Mar 21 12:32:58.399503 containerd[1483]: time="2025-03-21T12:32:58.399478734Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 21 12:32:58.399785 containerd[1483]: time="2025-03-21T12:32:58.399745774Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 21 12:32:58.399877 containerd[1483]: time="2025-03-21T12:32:58.399800414Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 21 12:32:58.399877 containerd[1483]: time="2025-03-21T12:32:58.399864294Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 21 12:32:58.400010 containerd[1483]: time="2025-03-21T12:32:58.399989654Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 21 12:32:58.400046 containerd[1483]: time="2025-03-21T12:32:58.400027854Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 21 12:32:58.400066 containerd[1483]: time="2025-03-21T12:32:58.400042294Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 21 12:32:58.400066 containerd[1483]: time="2025-03-21T12:32:58.400060454Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 21 12:32:58.400099 containerd[1483]: time="2025-03-21T12:32:58.400072934Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 21 12:32:58.400099 containerd[1483]: time="2025-03-21T12:32:58.400083894Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 21 12:32:58.400132 containerd[1483]: time="2025-03-21T12:32:58.400098894Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 21 12:32:58.400132 containerd[1483]: time="2025-03-21T12:32:58.400127294Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 21 12:32:58.400168 containerd[1483]: 
time="2025-03-21T12:32:58.400140174Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 21 12:32:58.400168 containerd[1483]: time="2025-03-21T12:32:58.400150334Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 21 12:32:58.400202 containerd[1483]: time="2025-03-21T12:32:58.400182334Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 21 12:32:58.400219 containerd[1483]: time="2025-03-21T12:32:58.400198094Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 21 12:32:58.400219 containerd[1483]: time="2025-03-21T12:32:58.400208174Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 21 12:32:58.400253 containerd[1483]: time="2025-03-21T12:32:58.400217934Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 21 12:32:58.400253 containerd[1483]: time="2025-03-21T12:32:58.400226014Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 21 12:32:58.400253 containerd[1483]: time="2025-03-21T12:32:58.400236054Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 21 12:32:58.400253 containerd[1483]: time="2025-03-21T12:32:58.400247334Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 21 12:32:58.400340 containerd[1483]: time="2025-03-21T12:32:58.400325254Z" level=info msg="runtime interface created" Mar 21 12:32:58.400340 containerd[1483]: time="2025-03-21T12:32:58.400335254Z" level=info msg="created NRI interface" Mar 21 12:32:58.400378 containerd[1483]: time="2025-03-21T12:32:58.400344054Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 21 12:32:58.400378 containerd[1483]: time="2025-03-21T12:32:58.400356894Z" level=info msg="Connect containerd service" Mar 21 12:32:58.400415 containerd[1483]: time="2025-03-21T12:32:58.400383774Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 21 12:32:58.402944 containerd[1483]: time="2025-03-21T12:32:58.402656294Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 21 12:32:58.484009 sshd_keygen[1478]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 21 12:32:58.505426 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 21 12:32:58.508951 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 21 12:32:58.526460 systemd[1]: issuegen.service: Deactivated successfully. Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.525673814Z" level=info msg="Start subscribing containerd event" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.525820414Z" level=info msg="Start recovering state" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.525709654Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.525964454Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.525991014Z" level=info msg="Start event monitor" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526009774Z" level=info msg="Start cni network conf syncer for default" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526017934Z" level=info msg="Start streaming server" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526033974Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526042374Z" level=info msg="runtime interface starting up..." Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526048614Z" level=info msg="starting plugins..." Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526074054Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 21 12:32:58.526802 containerd[1483]: time="2025-03-21T12:32:58.526784894Z" level=info msg="containerd successfully booted in 0.150598s" Mar 21 12:32:58.527480 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 21 12:32:58.529240 systemd[1]: Started containerd.service - containerd container runtime. Mar 21 12:32:58.533157 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 21 12:32:58.551439 tar[1482]: linux-arm64/README.md Mar 21 12:32:58.564425 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 21 12:32:58.568027 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 21 12:32:58.570173 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 21 12:32:58.571196 systemd[1]: Reached target getty.target - Login Prompts. Mar 21 12:32:58.572997 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 21 12:32:59.100322 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 21 12:32:59.102825 systemd[1]: Started sshd@0-10.0.0.87:22-10.0.0.1:47656.service - OpenSSH per-connection server daemon (10.0.0.1:47656). Mar 21 12:32:59.184556 sshd[1563]: Accepted publickey for core from 10.0.0.1 port 47656 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:32:59.186531 sshd-session[1563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:32:59.197582 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 21 12:32:59.199774 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 21 12:32:59.205828 systemd-logind[1467]: New session 1 of user core. Mar 21 12:32:59.221301 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 21 12:32:59.225125 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 21 12:32:59.240075 (systemd)[1567]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 21 12:32:59.242183 systemd-logind[1467]: New session c1 of user core. Mar 21 12:32:59.349423 systemd[1567]: Queued start job for default target default.target. Mar 21 12:32:59.358041 systemd[1567]: Created slice app.slice - User Application Slice. Mar 21 12:32:59.358075 systemd[1567]: Reached target paths.target - Paths. Mar 21 12:32:59.358115 systemd[1567]: Reached target timers.target - Timers. 
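
Annotation: the "failed to load cni during init" error above is the CRI plugin reporting an empty /etc/cni/net.d; on a node whose CNI is installed later (for example by a DaemonSet) this is expected on first boot. For illustration only, a conflist of the kind the loader expects, saved as something like /etc/cni/net.d/10-example.conflist, could look as follows (plugin choice, bridge name and subnet are assumptions, not taken from this host):

    {
      "cniVersion": "1.0.0",
      "name": "example-bridge",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": {
            "type": "host-local",
            "ranges": [[{ "subnet": "10.85.0.0/16" }]],
            "routes": [{ "dst": "0.0.0.0/0" }]
          }
        }
      ]
    }
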
Mar 21 12:32:59.359414 systemd[1567]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 21 12:32:59.369206 systemd[1567]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 21 12:32:59.369277 systemd[1567]: Reached target sockets.target - Sockets. Mar 21 12:32:59.369319 systemd[1567]: Reached target basic.target - Basic System. Mar 21 12:32:59.369346 systemd[1567]: Reached target default.target - Main User Target. Mar 21 12:32:59.369371 systemd[1567]: Startup finished in 121ms. Mar 21 12:32:59.369635 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 21 12:32:59.372010 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 21 12:32:59.433237 systemd[1]: Started sshd@1-10.0.0.87:22-10.0.0.1:47672.service - OpenSSH per-connection server daemon (10.0.0.1:47672). Mar 21 12:32:59.496217 sshd[1578]: Accepted publickey for core from 10.0.0.1 port 47672 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:32:59.497474 sshd-session[1578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:32:59.501589 systemd-logind[1467]: New session 2 of user core. Mar 21 12:32:59.510091 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 21 12:32:59.561304 sshd[1580]: Connection closed by 10.0.0.1 port 47672 Mar 21 12:32:59.561744 sshd-session[1578]: pam_unix(sshd:session): session closed for user core Mar 21 12:32:59.571298 systemd[1]: sshd@1-10.0.0.87:22-10.0.0.1:47672.service: Deactivated successfully. Mar 21 12:32:59.572811 systemd[1]: session-2.scope: Deactivated successfully. Mar 21 12:32:59.574605 systemd-logind[1467]: Session 2 logged out. Waiting for processes to exit. Mar 21 12:32:59.575353 systemd[1]: Started sshd@2-10.0.0.87:22-10.0.0.1:47688.service - OpenSSH per-connection server daemon (10.0.0.1:47688). Mar 21 12:32:59.577135 systemd-logind[1467]: Removed session 2. Mar 21 12:32:59.592029 systemd-networkd[1399]: eth0: Gained IPv6LL Mar 21 12:32:59.597645 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 21 12:32:59.599589 systemd[1]: Reached target network-online.target - Network is Online. Mar 21 12:32:59.602271 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 21 12:32:59.604783 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:32:59.617143 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 21 12:32:59.634195 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 21 12:32:59.635408 sshd[1585]: Accepted publickey for core from 10.0.0.1 port 47688 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:32:59.635874 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 21 12:32:59.636874 sshd-session[1585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:32:59.637362 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 21 12:32:59.643911 systemd-logind[1467]: New session 3 of user core. Mar 21 12:32:59.652102 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 21 12:32:59.653368 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Mar 21 12:32:59.705470 sshd[1606]: Connection closed by 10.0.0.1 port 47688 Mar 21 12:32:59.705954 sshd-session[1585]: pam_unix(sshd:session): session closed for user core Mar 21 12:32:59.709712 systemd[1]: sshd@2-10.0.0.87:22-10.0.0.1:47688.service: Deactivated successfully. Mar 21 12:32:59.711453 systemd[1]: session-3.scope: Deactivated successfully. Mar 21 12:32:59.712094 systemd-logind[1467]: Session 3 logged out. Waiting for processes to exit. Mar 21 12:32:59.713329 systemd-logind[1467]: Removed session 3. Mar 21 12:33:00.135411 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:00.137005 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 21 12:33:00.139440 (kubelet)[1616]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 21 12:33:00.143355 systemd[1]: Startup finished in 522ms (kernel) + 10.102s (initrd) + 3.846s (userspace) = 14.471s. Mar 21 12:33:00.549415 kubelet[1616]: E0321 12:33:00.549291 1616 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 21 12:33:00.552338 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 21 12:33:00.552503 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 21 12:33:00.552964 systemd[1]: kubelet.service: Consumed 802ms CPU time, 250.2M memory peak. Mar 21 12:33:09.716864 systemd[1]: Started sshd@3-10.0.0.87:22-10.0.0.1:46472.service - OpenSSH per-connection server daemon (10.0.0.1:46472). Mar 21 12:33:09.765518 sshd[1630]: Accepted publickey for core from 10.0.0.1 port 46472 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:33:09.766828 sshd-session[1630]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:33:09.770438 systemd-logind[1467]: New session 4 of user core. Mar 21 12:33:09.781097 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 21 12:33:09.832453 sshd[1632]: Connection closed by 10.0.0.1 port 46472 Mar 21 12:33:09.832959 sshd-session[1630]: pam_unix(sshd:session): session closed for user core Mar 21 12:33:09.847831 systemd[1]: sshd@3-10.0.0.87:22-10.0.0.1:46472.service: Deactivated successfully. Mar 21 12:33:09.849417 systemd[1]: session-4.scope: Deactivated successfully. Mar 21 12:33:09.850114 systemd-logind[1467]: Session 4 logged out. Waiting for processes to exit. Mar 21 12:33:09.851981 systemd[1]: Started sshd@4-10.0.0.87:22-10.0.0.1:46480.service - OpenSSH per-connection server daemon (10.0.0.1:46480). Mar 21 12:33:09.852811 systemd-logind[1467]: Removed session 4. Mar 21 12:33:09.907915 sshd[1637]: Accepted publickey for core from 10.0.0.1 port 46480 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:33:09.909162 sshd-session[1637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:33:09.913671 systemd-logind[1467]: New session 5 of user core. Mar 21 12:33:09.927088 systemd[1]: Started session-5.scope - Session 5 of User core. 
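
Annotation: this first kubelet failure (and the identical ones later in the log) is the kubelet exiting because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-provisioned node that file is written by kubeadm init/join, so the unit keeps failing until then. For orientation, the file it is looking for is a KubeletConfiguration document; a minimal illustrative sketch, not the file kubeadm will eventually write:

    # /var/lib/kubelet/config.yaml -- minimal illustrative sketch
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd            # matches SystemdCgroup=true in the containerd runc options logged above
    staticPodPath: /etc/kubernetes/manifests
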
Mar 21 12:33:09.975711 sshd[1640]: Connection closed by 10.0.0.1 port 46480 Mar 21 12:33:09.975519 sshd-session[1637]: pam_unix(sshd:session): session closed for user core Mar 21 12:33:09.985314 systemd[1]: sshd@4-10.0.0.87:22-10.0.0.1:46480.service: Deactivated successfully. Mar 21 12:33:09.986897 systemd[1]: session-5.scope: Deactivated successfully. Mar 21 12:33:09.987564 systemd-logind[1467]: Session 5 logged out. Waiting for processes to exit. Mar 21 12:33:09.989454 systemd[1]: Started sshd@5-10.0.0.87:22-10.0.0.1:46496.service - OpenSSH per-connection server daemon (10.0.0.1:46496). Mar 21 12:33:09.990173 systemd-logind[1467]: Removed session 5. Mar 21 12:33:10.042091 sshd[1645]: Accepted publickey for core from 10.0.0.1 port 46496 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:33:10.043540 sshd-session[1645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:33:10.047357 systemd-logind[1467]: New session 6 of user core. Mar 21 12:33:10.055083 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 21 12:33:10.106763 sshd[1648]: Connection closed by 10.0.0.1 port 46496 Mar 21 12:33:10.107115 sshd-session[1645]: pam_unix(sshd:session): session closed for user core Mar 21 12:33:10.126085 systemd[1]: sshd@5-10.0.0.87:22-10.0.0.1:46496.service: Deactivated successfully. Mar 21 12:33:10.127840 systemd[1]: session-6.scope: Deactivated successfully. Mar 21 12:33:10.129256 systemd-logind[1467]: Session 6 logged out. Waiting for processes to exit. Mar 21 12:33:10.130532 systemd[1]: Started sshd@6-10.0.0.87:22-10.0.0.1:46512.service - OpenSSH per-connection server daemon (10.0.0.1:46512). Mar 21 12:33:10.131236 systemd-logind[1467]: Removed session 6. Mar 21 12:33:10.189231 sshd[1653]: Accepted publickey for core from 10.0.0.1 port 46512 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:33:10.190369 sshd-session[1653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:33:10.193997 systemd-logind[1467]: New session 7 of user core. Mar 21 12:33:10.200126 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 21 12:33:10.260518 sudo[1657]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 21 12:33:10.260774 sudo[1657]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 12:33:10.280685 sudo[1657]: pam_unix(sudo:session): session closed for user root Mar 21 12:33:10.281966 sshd[1656]: Connection closed by 10.0.0.1 port 46512 Mar 21 12:33:10.282333 sshd-session[1653]: pam_unix(sshd:session): session closed for user core Mar 21 12:33:10.292179 systemd[1]: sshd@6-10.0.0.87:22-10.0.0.1:46512.service: Deactivated successfully. Mar 21 12:33:10.293655 systemd[1]: session-7.scope: Deactivated successfully. Mar 21 12:33:10.294316 systemd-logind[1467]: Session 7 logged out. Waiting for processes to exit. Mar 21 12:33:10.296053 systemd[1]: Started sshd@7-10.0.0.87:22-10.0.0.1:46528.service - OpenSSH per-connection server daemon (10.0.0.1:46528). Mar 21 12:33:10.296697 systemd-logind[1467]: Removed session 7. Mar 21 12:33:10.344867 sshd[1662]: Accepted publickey for core from 10.0.0.1 port 46528 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:33:10.345948 sshd-session[1662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:33:10.349272 systemd-logind[1467]: New session 8 of user core. 
Mar 21 12:33:10.366059 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 21 12:33:10.416871 sudo[1667]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 21 12:33:10.417202 sudo[1667]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 12:33:10.420558 sudo[1667]: pam_unix(sudo:session): session closed for user root Mar 21 12:33:10.425626 sudo[1666]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 21 12:33:10.425900 sudo[1666]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 12:33:10.434675 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 21 12:33:10.471176 augenrules[1689]: No rules Mar 21 12:33:10.472317 systemd[1]: audit-rules.service: Deactivated successfully. Mar 21 12:33:10.473945 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 21 12:33:10.474905 sudo[1666]: pam_unix(sudo:session): session closed for user root Mar 21 12:33:10.476267 sshd[1665]: Connection closed by 10.0.0.1 port 46528 Mar 21 12:33:10.476749 sshd-session[1662]: pam_unix(sshd:session): session closed for user core Mar 21 12:33:10.489206 systemd[1]: sshd@7-10.0.0.87:22-10.0.0.1:46528.service: Deactivated successfully. Mar 21 12:33:10.490737 systemd[1]: session-8.scope: Deactivated successfully. Mar 21 12:33:10.492257 systemd-logind[1467]: Session 8 logged out. Waiting for processes to exit. Mar 21 12:33:10.493595 systemd[1]: Started sshd@8-10.0.0.87:22-10.0.0.1:46542.service - OpenSSH per-connection server daemon (10.0.0.1:46542). Mar 21 12:33:10.495212 systemd-logind[1467]: Removed session 8. Mar 21 12:33:10.548478 sshd[1697]: Accepted publickey for core from 10.0.0.1 port 46542 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:33:10.549638 sshd-session[1697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:33:10.552880 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 21 12:33:10.554980 systemd-logind[1467]: New session 9 of user core. Mar 21 12:33:10.565143 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 21 12:33:10.566535 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:33:10.617583 sudo[1704]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 21 12:33:10.617869 sudo[1704]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 21 12:33:10.680853 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:10.684853 (kubelet)[1719]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 21 12:33:10.730024 kubelet[1719]: E0321 12:33:10.729962 1719 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 21 12:33:10.733196 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 21 12:33:10.733355 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 21 12:33:10.733868 systemd[1]: kubelet.service: Consumed 148ms CPU time, 105.2M memory peak. 
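
Annotation: both kubelet starts so far warn that KUBELET_EXTRA_ARGS and KUBELET_KUBEADM_ARGS are referenced but unset: the unit expands those variables from environment files that do not exist until the node is configured. On kubeadm-managed nodes the second one is conventionally written to /var/lib/kubelet/kubeadm-flags.env by kubeadm; a hedged sketch of such a file, with flags inferred from the deprecation warnings printed at the later successful start (the exact contents are an assumption, not read from this node):

    # /var/lib/kubelet/kubeadm-flags.env (illustrative; written by kubeadm, not yet present here)
    KUBELET_KUBEADM_ARGS="--container-runtime-endpoint=unix:///run/containerd/containerd.sock --pod-infra-container-image=registry.k8s.io/pause:3.10"
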
Mar 21 12:33:10.964050 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 21 12:33:10.980280 (dockerd)[1739]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 21 12:33:11.220783 dockerd[1739]: time="2025-03-21T12:33:11.220000534Z" level=info msg="Starting up" Mar 21 12:33:11.221259 dockerd[1739]: time="2025-03-21T12:33:11.221233814Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 21 12:33:11.325477 dockerd[1739]: time="2025-03-21T12:33:11.325424374Z" level=info msg="Loading containers: start." Mar 21 12:33:11.482951 kernel: Initializing XFRM netlink socket Mar 21 12:33:11.547225 systemd-networkd[1399]: docker0: Link UP Mar 21 12:33:11.629323 dockerd[1739]: time="2025-03-21T12:33:11.629267934Z" level=info msg="Loading containers: done." Mar 21 12:33:11.646597 dockerd[1739]: time="2025-03-21T12:33:11.646538894Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 21 12:33:11.646746 dockerd[1739]: time="2025-03-21T12:33:11.646653534Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 21 12:33:11.646863 dockerd[1739]: time="2025-03-21T12:33:11.646842334Z" level=info msg="Daemon has completed initialization" Mar 21 12:33:11.675940 dockerd[1739]: time="2025-03-21T12:33:11.675870974Z" level=info msg="API listen on /run/docker.sock" Mar 21 12:33:11.676056 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 21 12:33:12.248467 containerd[1483]: time="2025-03-21T12:33:12.248422414Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\"" Mar 21 12:33:12.893244 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3624892810.mount: Deactivated successfully. 
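
Annotation: the PullImage lines that follow record CRI image-service requests arriving at containerd; the resulting images land in containerd's k8s.io namespace and can be inspected from the node with the bundled ctr tool. Illustrative commands, not taken from this log:

    ctr --namespace k8s.io images ls
    ctr --namespace k8s.io images pull registry.k8s.io/kube-apiserver:v1.32.3
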
Mar 21 12:33:14.049565 containerd[1483]: time="2025-03-21T12:33:14.049516694Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:14.050585 containerd[1483]: time="2025-03-21T12:33:14.050451094Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.3: active requests=0, bytes read=26231952" Mar 21 12:33:14.051346 containerd[1483]: time="2025-03-21T12:33:14.051303694Z" level=info msg="ImageCreate event name:\"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:14.053607 containerd[1483]: time="2025-03-21T12:33:14.053566854Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:14.054605 containerd[1483]: time="2025-03-21T12:33:14.054568014Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.3\" with image id \"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\", size \"26228750\" in 1.80609976s" Mar 21 12:33:14.054652 containerd[1483]: time="2025-03-21T12:33:14.054606414Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\" returns image reference \"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\"" Mar 21 12:33:14.055176 containerd[1483]: time="2025-03-21T12:33:14.055153814Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\"" Mar 21 12:33:15.400307 containerd[1483]: time="2025-03-21T12:33:15.400255334Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:15.400844 containerd[1483]: time="2025-03-21T12:33:15.400788134Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.3: active requests=0, bytes read=22530034" Mar 21 12:33:15.401551 containerd[1483]: time="2025-03-21T12:33:15.401475414Z" level=info msg="ImageCreate event name:\"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:15.404357 containerd[1483]: time="2025-03-21T12:33:15.404287254Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:15.405266 containerd[1483]: time="2025-03-21T12:33:15.405230574Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.3\" with image id \"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\", size \"23970828\" in 1.3499776s" Mar 21 12:33:15.405266 containerd[1483]: time="2025-03-21T12:33:15.405267934Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\" returns image reference \"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\"" Mar 21 12:33:15.406035 
containerd[1483]: time="2025-03-21T12:33:15.406011294Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\"" Mar 21 12:33:16.618680 containerd[1483]: time="2025-03-21T12:33:16.618620214Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:16.619043 containerd[1483]: time="2025-03-21T12:33:16.618982334Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.3: active requests=0, bytes read=17482563" Mar 21 12:33:16.619897 containerd[1483]: time="2025-03-21T12:33:16.619850814Z" level=info msg="ImageCreate event name:\"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:16.622128 containerd[1483]: time="2025-03-21T12:33:16.622095534Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:16.623091 containerd[1483]: time="2025-03-21T12:33:16.623033094Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.3\" with image id \"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\", size \"18923375\" in 1.2169914s" Mar 21 12:33:16.623091 containerd[1483]: time="2025-03-21T12:33:16.623071294Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\" returns image reference \"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\"" Mar 21 12:33:16.623624 containerd[1483]: time="2025-03-21T12:33:16.623563814Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\"" Mar 21 12:33:17.656153 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3181906579.mount: Deactivated successfully. 
Mar 21 12:33:17.989781 containerd[1483]: time="2025-03-21T12:33:17.989621494Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:17.990611 containerd[1483]: time="2025-03-21T12:33:17.990561054Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=27370097" Mar 21 12:33:17.991249 containerd[1483]: time="2025-03-21T12:33:17.991214214Z" level=info msg="ImageCreate event name:\"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:17.993271 containerd[1483]: time="2025-03-21T12:33:17.993234334Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:17.994109 containerd[1483]: time="2025-03-21T12:33:17.994076214Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"27369114\" in 1.37047932s" Mar 21 12:33:17.994146 containerd[1483]: time="2025-03-21T12:33:17.994111454Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\"" Mar 21 12:33:17.994565 containerd[1483]: time="2025-03-21T12:33:17.994541054Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Mar 21 12:33:18.535987 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3514553161.mount: Deactivated successfully. 
Mar 21 12:33:19.390403 containerd[1483]: time="2025-03-21T12:33:19.390342054Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:19.391086 containerd[1483]: time="2025-03-21T12:33:19.391029494Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Mar 21 12:33:19.391725 containerd[1483]: time="2025-03-21T12:33:19.391689014Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:19.394110 containerd[1483]: time="2025-03-21T12:33:19.394082814Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:19.396008 containerd[1483]: time="2025-03-21T12:33:19.395971774Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.401394s" Mar 21 12:33:19.396070 containerd[1483]: time="2025-03-21T12:33:19.396009054Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Mar 21 12:33:19.396461 containerd[1483]: time="2025-03-21T12:33:19.396421654Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 21 12:33:19.920276 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2139204077.mount: Deactivated successfully. 
Mar 21 12:33:19.923729 containerd[1483]: time="2025-03-21T12:33:19.923684814Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 12:33:19.924980 containerd[1483]: time="2025-03-21T12:33:19.924931454Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Mar 21 12:33:19.926057 containerd[1483]: time="2025-03-21T12:33:19.926005254Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 12:33:19.928090 containerd[1483]: time="2025-03-21T12:33:19.928051414Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 12:33:19.929326 containerd[1483]: time="2025-03-21T12:33:19.929292614Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 532.82532ms" Mar 21 12:33:19.929357 containerd[1483]: time="2025-03-21T12:33:19.929328374Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Mar 21 12:33:19.929755 containerd[1483]: time="2025-03-21T12:33:19.929725214Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Mar 21 12:33:20.489394 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2572783021.mount: Deactivated successfully. Mar 21 12:33:20.983752 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 21 12:33:20.985177 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:33:21.102242 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:21.105135 (kubelet)[2128]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 21 12:33:21.146869 kubelet[2128]: E0321 12:33:21.146810 2128 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 21 12:33:21.149226 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 21 12:33:21.149371 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 21 12:33:21.149697 systemd[1]: kubelet.service: Consumed 131ms CPU time, 105.2M memory peak. 
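
Annotation: the kubelet has now failed three times, and each failure is followed roughly ten seconds later by a "Scheduled restart job" line. That is systemd's Restart= handling re-queueing the unit rather than anything the kubelet does itself; the observed spacing is consistent with a service stanza along these lines (inferred from the timestamps, the actual unit file is not shown in this log):

    [Service]
    # Either Restart=always or Restart=on-failure matches the behaviour seen here.
    Restart=on-failure
    RestartSec=10
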
Mar 21 12:33:22.985304 containerd[1483]: time="2025-03-21T12:33:22.985244694Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:22.986194 containerd[1483]: time="2025-03-21T12:33:22.986083054Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812431" Mar 21 12:33:22.986807 containerd[1483]: time="2025-03-21T12:33:22.986764934Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:22.989496 containerd[1483]: time="2025-03-21T12:33:22.989467014Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:22.990701 containerd[1483]: time="2025-03-21T12:33:22.990612534Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 3.06085412s" Mar 21 12:33:22.990701 containerd[1483]: time="2025-03-21T12:33:22.990649014Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Mar 21 12:33:28.691996 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:28.692571 systemd[1]: kubelet.service: Consumed 131ms CPU time, 105.2M memory peak. Mar 21 12:33:28.694568 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:33:28.718983 systemd[1]: Reload requested from client PID 2172 ('systemctl') (unit session-9.scope)... Mar 21 12:33:28.719000 systemd[1]: Reloading... Mar 21 12:33:28.800333 zram_generator::config[2215]: No configuration found. Mar 21 12:33:28.920281 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 12:33:28.992006 systemd[1]: Reloading finished in 272 ms. Mar 21 12:33:29.053164 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 21 12:33:29.053235 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 21 12:33:29.054940 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:29.054985 systemd[1]: kubelet.service: Consumed 86ms CPU time, 90.2M memory peak. Mar 21 12:33:29.057011 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:33:29.171698 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:29.175245 (kubelet)[2261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 21 12:33:29.207551 kubelet[2261]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 12:33:29.207551 kubelet[2261]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Mar 21 12:33:29.207551 kubelet[2261]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 12:33:29.207823 kubelet[2261]: I0321 12:33:29.207660 2261 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 12:33:29.917718 kubelet[2261]: I0321 12:33:29.917678 2261 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 21 12:33:29.917718 kubelet[2261]: I0321 12:33:29.917708 2261 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 21 12:33:29.917963 kubelet[2261]: I0321 12:33:29.917942 2261 server.go:954] "Client rotation is on, will bootstrap in background" Mar 21 12:33:29.938472 kubelet[2261]: E0321 12:33:29.938446 2261 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.87:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.87:6443: connect: connection refused" logger="UnhandledError" Mar 21 12:33:29.939254 kubelet[2261]: I0321 12:33:29.939241 2261 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 21 12:33:29.949899 kubelet[2261]: I0321 12:33:29.949875 2261 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 21 12:33:29.952971 kubelet[2261]: I0321 12:33:29.952947 2261 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 21 12:33:29.953186 kubelet[2261]: I0321 12:33:29.953153 2261 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 21 12:33:29.953331 kubelet[2261]: I0321 12:33:29.953178 2261 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 21 12:33:29.953429 kubelet[2261]: I0321 12:33:29.953406 2261 topology_manager.go:138] "Creating topology manager with none policy" Mar 21 12:33:29.953429 kubelet[2261]: I0321 12:33:29.953417 2261 container_manager_linux.go:304] "Creating device plugin manager" Mar 21 12:33:29.953619 kubelet[2261]: I0321 12:33:29.953594 2261 state_mem.go:36] "Initialized new in-memory state store" Mar 21 12:33:29.955902 kubelet[2261]: I0321 12:33:29.955872 2261 kubelet.go:446] "Attempting to sync node with API server" Mar 21 12:33:29.955902 kubelet[2261]: I0321 12:33:29.955896 2261 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 21 12:33:29.956985 kubelet[2261]: I0321 12:33:29.956905 2261 kubelet.go:352] "Adding apiserver pod source" Mar 21 12:33:29.956985 kubelet[2261]: I0321 12:33:29.956939 2261 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 21 12:33:29.960462 kubelet[2261]: W0321 12:33:29.960350 2261 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.87:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.87:6443: connect: connection refused Mar 21 12:33:29.960462 kubelet[2261]: W0321 12:33:29.960361 2261 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.87:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.87:6443: connect: connection refused Mar 21 12:33:29.960462 kubelet[2261]: E0321 12:33:29.960421 2261 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.87:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.87:6443: connect: connection refused" logger="UnhandledError" Mar 21 12:33:29.960462 kubelet[2261]: E0321 12:33:29.960425 2261 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.87:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.87:6443: connect: connection refused" logger="UnhandledError" Mar 21 12:33:29.962606 kubelet[2261]: I0321 12:33:29.962571 2261 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 21 12:33:29.963227 kubelet[2261]: I0321 12:33:29.963201 2261 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 12:33:29.963382 kubelet[2261]: W0321 12:33:29.963369 2261 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 21 12:33:29.964335 kubelet[2261]: I0321 12:33:29.964208 2261 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 21 12:33:29.964335 kubelet[2261]: I0321 12:33:29.964246 2261 server.go:1287] "Started kubelet" Mar 21 12:33:29.964335 kubelet[2261]: I0321 12:33:29.964289 2261 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 12:33:29.965380 kubelet[2261]: I0321 12:33:29.965181 2261 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 12:33:29.965494 kubelet[2261]: I0321 12:33:29.965475 2261 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 12:33:29.965536 kubelet[2261]: I0321 12:33:29.965237 2261 server.go:490] "Adding debug handlers to kubelet server" Mar 21 12:33:29.966226 kubelet[2261]: I0321 12:33:29.966062 2261 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 12:33:29.966573 kubelet[2261]: I0321 12:33:29.966545 2261 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 21 12:33:29.967309 kubelet[2261]: E0321 12:33:29.967063 2261 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.87:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.87:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182ed175db939896 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-21 12:33:29.964222614 +0000 UTC m=+0.786091361,LastTimestamp:2025-03-21 12:33:29.964222614 +0000 UTC m=+0.786091361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 21 12:33:29.967512 kubelet[2261]: I0321 12:33:29.967453 2261 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 21 12:33:29.967572 kubelet[2261]: I0321 12:33:29.967554 2261 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 21 12:33:29.967615 kubelet[2261]: I0321 12:33:29.967603 2261 
reconciler.go:26] "Reconciler: start to sync state" Mar 21 12:33:29.967887 kubelet[2261]: E0321 12:33:29.967657 2261 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 21 12:33:29.967887 kubelet[2261]: W0321 12:33:29.967840 2261 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.87:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.87:6443: connect: connection refused Mar 21 12:33:29.968132 kubelet[2261]: E0321 12:33:29.968110 2261 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.87:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.87:6443: connect: connection refused" logger="UnhandledError" Mar 21 12:33:29.969381 kubelet[2261]: E0321 12:33:29.968216 2261 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 21 12:33:29.969933 kubelet[2261]: I0321 12:33:29.968550 2261 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 21 12:33:29.970109 kubelet[2261]: E0321 12:33:29.968822 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.87:6443: connect: connection refused" interval="200ms" Mar 21 12:33:29.971310 kubelet[2261]: I0321 12:33:29.971292 2261 factory.go:221] Registration of the containerd container factory successfully Mar 21 12:33:29.971381 kubelet[2261]: I0321 12:33:29.971372 2261 factory.go:221] Registration of the systemd container factory successfully Mar 21 12:33:29.981904 kubelet[2261]: I0321 12:33:29.981880 2261 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 21 12:33:29.981904 kubelet[2261]: I0321 12:33:29.981895 2261 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 21 12:33:29.981904 kubelet[2261]: I0321 12:33:29.981911 2261 state_mem.go:36] "Initialized new in-memory state store" Mar 21 12:33:29.982967 kubelet[2261]: I0321 12:33:29.982834 2261 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 12:33:29.983961 kubelet[2261]: I0321 12:33:29.983893 2261 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 21 12:33:29.984027 kubelet[2261]: I0321 12:33:29.984017 2261 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 21 12:33:29.984097 kubelet[2261]: I0321 12:33:29.984086 2261 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
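
Annotation: this start also warns that --container-runtime-endpoint and --volume-plugin-dir are deprecated flags that belong in the config file, and the container-manager dump above records the effective settings (systemd cgroup driver, the standard hard-eviction thresholds). Expressed as KubeletConfiguration fields they would look roughly like the sketch below; the endpoint, plugin directory and thresholds are copied from values logged above, everything else about the node's real config file is unknown:

    # fields replacing the deprecated flags (illustrative additions to /var/lib/kubelet/config.yaml)
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/
    evictionHard:
      memory.available: "100Mi"
      nodefs.available: "10%"
      nodefs.inodesFree: "5%"
      imagefs.available: "15%"
      imagefs.inodesFree: "5%"
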
Mar 21 12:33:29.984152 kubelet[2261]: I0321 12:33:29.984143 2261 kubelet.go:2388] "Starting kubelet main sync loop" Mar 21 12:33:29.984232 kubelet[2261]: E0321 12:33:29.984218 2261 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 12:33:29.984845 kubelet[2261]: W0321 12:33:29.984658 2261 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.87:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.87:6443: connect: connection refused Mar 21 12:33:29.984845 kubelet[2261]: E0321 12:33:29.984701 2261 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.87:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.87:6443: connect: connection refused" logger="UnhandledError" Mar 21 12:33:30.050539 kubelet[2261]: I0321 12:33:30.050507 2261 policy_none.go:49] "None policy: Start" Mar 21 12:33:30.050539 kubelet[2261]: I0321 12:33:30.050539 2261 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 21 12:33:30.050539 kubelet[2261]: I0321 12:33:30.050553 2261 state_mem.go:35] "Initializing new in-memory state store" Mar 21 12:33:30.055709 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 21 12:33:30.067855 kubelet[2261]: E0321 12:33:30.067828 2261 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 21 12:33:30.068437 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 21 12:33:30.084463 kubelet[2261]: E0321 12:33:30.084434 2261 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 21 12:33:30.086103 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 21 12:33:30.087243 kubelet[2261]: I0321 12:33:30.087221 2261 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 12:33:30.087815 kubelet[2261]: I0321 12:33:30.087382 2261 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 12:33:30.087815 kubelet[2261]: I0321 12:33:30.087409 2261 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 12:33:30.087815 kubelet[2261]: I0321 12:33:30.087764 2261 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 12:33:30.088698 kubelet[2261]: E0321 12:33:30.088680 2261 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 21 12:33:30.088773 kubelet[2261]: E0321 12:33:30.088713 2261 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 21 12:33:30.170676 kubelet[2261]: E0321 12:33:30.170571 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.87:6443: connect: connection refused" interval="400ms" Mar 21 12:33:30.188649 kubelet[2261]: I0321 12:33:30.188607 2261 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 21 12:33:30.188972 kubelet[2261]: E0321 12:33:30.188948 2261 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.87:6443/api/v1/nodes\": dial tcp 10.0.0.87:6443: connect: connection refused" node="localhost" Mar 21 12:33:30.291336 systemd[1]: Created slice kubepods-burstable-podcbbb394ff48414687df77e1bc213eeb5.slice - libcontainer container kubepods-burstable-podcbbb394ff48414687df77e1bc213eeb5.slice. Mar 21 12:33:30.318326 kubelet[2261]: E0321 12:33:30.318202 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:30.321006 systemd[1]: Created slice kubepods-burstable-pod3700e556aa2777679a324159272023f1.slice - libcontainer container kubepods-burstable-pod3700e556aa2777679a324159272023f1.slice. Mar 21 12:33:30.331991 kubelet[2261]: E0321 12:33:30.331956 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:30.334669 systemd[1]: Created slice kubepods-burstable-pod9358feba7050c46c4389e1ebd9242717.slice - libcontainer container kubepods-burstable-pod9358feba7050c46c4389e1ebd9242717.slice. 
Mar 21 12:33:30.336066 kubelet[2261]: E0321 12:33:30.336038 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:30.369355 kubelet[2261]: I0321 12:33:30.369294 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9358feba7050c46c4389e1ebd9242717-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"9358feba7050c46c4389e1ebd9242717\") " pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:30.369355 kubelet[2261]: I0321 12:33:30.369329 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:30.369355 kubelet[2261]: I0321 12:33:30.369346 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:30.369503 kubelet[2261]: I0321 12:33:30.369363 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:30.369503 kubelet[2261]: I0321 12:33:30.369378 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:30.369503 kubelet[2261]: I0321 12:33:30.369448 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3700e556aa2777679a324159272023f1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"3700e556aa2777679a324159272023f1\") " pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:30.369503 kubelet[2261]: I0321 12:33:30.369487 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9358feba7050c46c4389e1ebd9242717-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"9358feba7050c46c4389e1ebd9242717\") " pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:30.369587 kubelet[2261]: I0321 12:33:30.369516 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9358feba7050c46c4389e1ebd9242717-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"9358feba7050c46c4389e1ebd9242717\") " pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:30.369587 kubelet[2261]: I0321 12:33:30.369532 2261 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:30.390125 kubelet[2261]: I0321 12:33:30.390105 2261 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 21 12:33:30.390413 kubelet[2261]: E0321 12:33:30.390379 2261 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.87:6443/api/v1/nodes\": dial tcp 10.0.0.87:6443: connect: connection refused" node="localhost" Mar 21 12:33:30.571032 kubelet[2261]: E0321 12:33:30.570937 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.87:6443: connect: connection refused" interval="800ms" Mar 21 12:33:30.619148 kubelet[2261]: E0321 12:33:30.619120 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.619745 containerd[1483]: time="2025-03-21T12:33:30.619704094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:cbbb394ff48414687df77e1bc213eeb5,Namespace:kube-system,Attempt:0,}" Mar 21 12:33:30.633305 kubelet[2261]: E0321 12:33:30.633260 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.633625 containerd[1483]: time="2025-03-21T12:33:30.633583654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:3700e556aa2777679a324159272023f1,Namespace:kube-system,Attempt:0,}" Mar 21 12:33:30.635720 containerd[1483]: time="2025-03-21T12:33:30.635657174Z" level=info msg="connecting to shim 138bf3c8d46cf9a68c0ef41981bfaaf2f66bde36f45a6eb94245a1a6163e25f3" address="unix:///run/containerd/s/a55e60787df7b52aa7f1e8a04baf7c25f0c30b07a5ce15febaab1c944ef84fe4" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:30.637160 kubelet[2261]: E0321 12:33:30.637131 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.637484 containerd[1483]: time="2025-03-21T12:33:30.637440454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:9358feba7050c46c4389e1ebd9242717,Namespace:kube-system,Attempt:0,}" Mar 21 12:33:30.656732 containerd[1483]: time="2025-03-21T12:33:30.656096414Z" level=info msg="connecting to shim e9eeb5b472dadca3cc6984bc078bf952e9d4a6984ffa74cb6e3cb0b3cb5a42e5" address="unix:///run/containerd/s/eed61cdab4ddf2f50157faf40bb0965348d4b80a17e16a5ef5d69f9dbffe44b0" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:30.661090 systemd[1]: Started cri-containerd-138bf3c8d46cf9a68c0ef41981bfaaf2f66bde36f45a6eb94245a1a6163e25f3.scope - libcontainer container 138bf3c8d46cf9a68c0ef41981bfaaf2f66bde36f45a6eb94245a1a6163e25f3. 
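The RunPodSandbox messages above are containerd's CRI plugin answering gRPC calls from the kubelet. A small connectivity sketch against that CRI endpoint using the published cri-api client; only Version is called here, but the kubelet drives the same RuntimeService through RunPodSandbox, CreateContainer and StartContainer, as the surrounding entries show. The socket path is the stock containerd default and an assumption in this sketch.

```go
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Dial containerd's CRI endpoint over its unix socket (assumed default path).
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Simplest round trip on the RuntimeService; the kubelet uses the same service
	// for the RunPodSandbox -> CreateContainer -> StartContainer sequence in this log.
	v, err := rt.Version(ctx, &runtimeapi.VersionRequest{})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(v.RuntimeName, v.RuntimeVersion)
}
```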
Mar 21 12:33:30.670213 containerd[1483]: time="2025-03-21T12:33:30.670157534Z" level=info msg="connecting to shim 1028e19c882c0b354ae736bb7ccb641b8b67831dc46587d0afc9e785acc8d02c" address="unix:///run/containerd/s/51b83232b0926657eaa2dbc7aac775aee6776a4ca0748668bd3d842924c4f24f" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:30.689080 systemd[1]: Started cri-containerd-e9eeb5b472dadca3cc6984bc078bf952e9d4a6984ffa74cb6e3cb0b3cb5a42e5.scope - libcontainer container e9eeb5b472dadca3cc6984bc078bf952e9d4a6984ffa74cb6e3cb0b3cb5a42e5. Mar 21 12:33:30.692016 systemd[1]: Started cri-containerd-1028e19c882c0b354ae736bb7ccb641b8b67831dc46587d0afc9e785acc8d02c.scope - libcontainer container 1028e19c882c0b354ae736bb7ccb641b8b67831dc46587d0afc9e785acc8d02c. Mar 21 12:33:30.702824 containerd[1483]: time="2025-03-21T12:33:30.702784254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:cbbb394ff48414687df77e1bc213eeb5,Namespace:kube-system,Attempt:0,} returns sandbox id \"138bf3c8d46cf9a68c0ef41981bfaaf2f66bde36f45a6eb94245a1a6163e25f3\"" Mar 21 12:33:30.704319 kubelet[2261]: E0321 12:33:30.704248 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.708542 containerd[1483]: time="2025-03-21T12:33:30.708501214Z" level=info msg="CreateContainer within sandbox \"138bf3c8d46cf9a68c0ef41981bfaaf2f66bde36f45a6eb94245a1a6163e25f3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 21 12:33:30.717727 containerd[1483]: time="2025-03-21T12:33:30.717680134Z" level=info msg="Container e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:30.724810 containerd[1483]: time="2025-03-21T12:33:30.724774694Z" level=info msg="CreateContainer within sandbox \"138bf3c8d46cf9a68c0ef41981bfaaf2f66bde36f45a6eb94245a1a6163e25f3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046\"" Mar 21 12:33:30.725552 containerd[1483]: time="2025-03-21T12:33:30.725527814Z" level=info msg="StartContainer for \"e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046\"" Mar 21 12:33:30.726765 containerd[1483]: time="2025-03-21T12:33:30.726733134Z" level=info msg="connecting to shim e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046" address="unix:///run/containerd/s/a55e60787df7b52aa7f1e8a04baf7c25f0c30b07a5ce15febaab1c944ef84fe4" protocol=ttrpc version=3 Mar 21 12:33:30.727467 containerd[1483]: time="2025-03-21T12:33:30.727438934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:3700e556aa2777679a324159272023f1,Namespace:kube-system,Attempt:0,} returns sandbox id \"e9eeb5b472dadca3cc6984bc078bf952e9d4a6984ffa74cb6e3cb0b3cb5a42e5\"" Mar 21 12:33:30.728157 kubelet[2261]: E0321 12:33:30.728130 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.731346 containerd[1483]: time="2025-03-21T12:33:30.731289174Z" level=info msg="CreateContainer within sandbox \"e9eeb5b472dadca3cc6984bc078bf952e9d4a6984ffa74cb6e3cb0b3cb5a42e5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 21 12:33:30.733697 containerd[1483]: time="2025-03-21T12:33:30.733668254Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:9358feba7050c46c4389e1ebd9242717,Namespace:kube-system,Attempt:0,} returns sandbox id \"1028e19c882c0b354ae736bb7ccb641b8b67831dc46587d0afc9e785acc8d02c\"" Mar 21 12:33:30.734324 kubelet[2261]: E0321 12:33:30.734299 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.736443 containerd[1483]: time="2025-03-21T12:33:30.736386734Z" level=info msg="CreateContainer within sandbox \"1028e19c882c0b354ae736bb7ccb641b8b67831dc46587d0afc9e785acc8d02c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 21 12:33:30.739480 containerd[1483]: time="2025-03-21T12:33:30.739431894Z" level=info msg="Container 7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:30.744379 containerd[1483]: time="2025-03-21T12:33:30.744351414Z" level=info msg="Container fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:30.747518 containerd[1483]: time="2025-03-21T12:33:30.747478454Z" level=info msg="CreateContainer within sandbox \"e9eeb5b472dadca3cc6984bc078bf952e9d4a6984ffa74cb6e3cb0b3cb5a42e5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe\"" Mar 21 12:33:30.747851 containerd[1483]: time="2025-03-21T12:33:30.747815774Z" level=info msg="StartContainer for \"7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe\"" Mar 21 12:33:30.748857 containerd[1483]: time="2025-03-21T12:33:30.748831214Z" level=info msg="connecting to shim 7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe" address="unix:///run/containerd/s/eed61cdab4ddf2f50157faf40bb0965348d4b80a17e16a5ef5d69f9dbffe44b0" protocol=ttrpc version=3 Mar 21 12:33:30.749367 systemd[1]: Started cri-containerd-e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046.scope - libcontainer container e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046. Mar 21 12:33:30.749703 containerd[1483]: time="2025-03-21T12:33:30.749536334Z" level=info msg="CreateContainer within sandbox \"1028e19c882c0b354ae736bb7ccb641b8b67831dc46587d0afc9e785acc8d02c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4\"" Mar 21 12:33:30.750116 containerd[1483]: time="2025-03-21T12:33:30.750023294Z" level=info msg="StartContainer for \"fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4\"" Mar 21 12:33:30.751503 containerd[1483]: time="2025-03-21T12:33:30.751417974Z" level=info msg="connecting to shim fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4" address="unix:///run/containerd/s/51b83232b0926657eaa2dbc7aac775aee6776a4ca0748668bd3d842924c4f24f" protocol=ttrpc version=3 Mar 21 12:33:30.765054 systemd[1]: Started cri-containerd-7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe.scope - libcontainer container 7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe. Mar 21 12:33:30.768483 systemd[1]: Started cri-containerd-fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4.scope - libcontainer container fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4. 
Mar 21 12:33:30.791665 kubelet[2261]: I0321 12:33:30.791435 2261 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 21 12:33:30.791830 kubelet[2261]: E0321 12:33:30.791752 2261 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.87:6443/api/v1/nodes\": dial tcp 10.0.0.87:6443: connect: connection refused" node="localhost" Mar 21 12:33:30.805835 containerd[1483]: time="2025-03-21T12:33:30.805765974Z" level=info msg="StartContainer for \"7044bd64566d571d11fdabeaffcbab07ab942bffa282ec59735bb4ffbac888fe\" returns successfully" Mar 21 12:33:30.812883 containerd[1483]: time="2025-03-21T12:33:30.812802614Z" level=info msg="StartContainer for \"fde55c634946fe280cc8a938f7947719f858fc97ab34b1839325b2518a9529d4\" returns successfully" Mar 21 12:33:30.815113 containerd[1483]: time="2025-03-21T12:33:30.815082614Z" level=info msg="StartContainer for \"e61892f2f92a15a48dbfe38895de0c0c09a320ef834bfa0d082de823cb08b046\" returns successfully" Mar 21 12:33:30.992598 kubelet[2261]: E0321 12:33:30.992216 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:30.992598 kubelet[2261]: E0321 12:33:30.992369 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.998011 kubelet[2261]: E0321 12:33:30.996949 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:30.998206 kubelet[2261]: E0321 12:33:30.998180 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:30.999013 kubelet[2261]: E0321 12:33:30.998826 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:30.999013 kubelet[2261]: E0321 12:33:30.998937 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:31.592842 kubelet[2261]: I0321 12:33:31.592811 2261 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 21 12:33:32.000141 kubelet[2261]: E0321 12:33:31.999994 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:32.000141 kubelet[2261]: E0321 12:33:32.000132 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:32.000223 kubelet[2261]: E0321 12:33:32.000168 2261 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 21 12:33:32.000526 kubelet[2261]: E0321 12:33:32.000261 2261 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:33.468320 kubelet[2261]: E0321 12:33:33.468272 2261 nodelease.go:49] "Failed to get node when trying to set owner ref to the node 
lease" err="nodes \"localhost\" not found" node="localhost" Mar 21 12:33:33.558045 kubelet[2261]: I0321 12:33:33.558013 2261 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Mar 21 12:33:33.569642 kubelet[2261]: I0321 12:33:33.569613 2261 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:33.627512 kubelet[2261]: E0321 12:33:33.626430 2261 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:33.627512 kubelet[2261]: I0321 12:33:33.626458 2261 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:33.628679 kubelet[2261]: E0321 12:33:33.628648 2261 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:33.628679 kubelet[2261]: I0321 12:33:33.628674 2261 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:33.630449 kubelet[2261]: E0321 12:33:33.630411 2261 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:33.963279 kubelet[2261]: I0321 12:33:33.962853 2261 apiserver.go:52] "Watching apiserver" Mar 21 12:33:33.968081 kubelet[2261]: I0321 12:33:33.968053 2261 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 21 12:33:35.621452 systemd[1]: Reload requested from client PID 2536 ('systemctl') (unit session-9.scope)... Mar 21 12:33:35.621467 systemd[1]: Reloading... Mar 21 12:33:35.696953 zram_generator::config[2583]: No configuration found. Mar 21 12:33:35.776135 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 12:33:35.858817 systemd[1]: Reloading finished in 237 ms. Mar 21 12:33:35.879023 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:33:35.891782 systemd[1]: kubelet.service: Deactivated successfully. Mar 21 12:33:35.893257 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:35.893313 systemd[1]: kubelet.service: Consumed 1.148s CPU time, 122.9M memory peak. Mar 21 12:33:35.895045 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 12:33:36.012595 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 12:33:36.016658 (kubelet)[2622]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 21 12:33:36.052672 kubelet[2622]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 12:33:36.052672 kubelet[2622]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Mar 21 12:33:36.052672 kubelet[2622]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 12:33:36.053165 kubelet[2622]: I0321 12:33:36.053118 2622 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 12:33:36.058483 kubelet[2622]: I0321 12:33:36.058447 2622 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 21 12:33:36.059943 kubelet[2622]: I0321 12:33:36.058580 2622 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 21 12:33:36.059943 kubelet[2622]: I0321 12:33:36.058819 2622 server.go:954] "Client rotation is on, will bootstrap in background" Mar 21 12:33:36.060140 kubelet[2622]: I0321 12:33:36.060122 2622 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 21 12:33:36.063195 kubelet[2622]: I0321 12:33:36.063159 2622 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 21 12:33:36.068273 kubelet[2622]: I0321 12:33:36.068253 2622 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 21 12:33:36.070906 kubelet[2622]: I0321 12:33:36.070883 2622 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 21 12:33:36.071198 kubelet[2622]: I0321 12:33:36.071168 2622 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 21 12:33:36.071438 kubelet[2622]: I0321 12:33:36.071265 2622 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 21 12:33:36.071564 kubelet[2622]: I0321 12:33:36.071552 2622 topology_manager.go:138] "Creating topology manager with none policy" Mar 21 12:33:36.071615 
kubelet[2622]: I0321 12:33:36.071608 2622 container_manager_linux.go:304] "Creating device plugin manager" Mar 21 12:33:36.071711 kubelet[2622]: I0321 12:33:36.071702 2622 state_mem.go:36] "Initialized new in-memory state store" Mar 21 12:33:36.071906 kubelet[2622]: I0321 12:33:36.071893 2622 kubelet.go:446] "Attempting to sync node with API server" Mar 21 12:33:36.072009 kubelet[2622]: I0321 12:33:36.071996 2622 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 21 12:33:36.072077 kubelet[2622]: I0321 12:33:36.072069 2622 kubelet.go:352] "Adding apiserver pod source" Mar 21 12:33:36.072128 kubelet[2622]: I0321 12:33:36.072120 2622 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 21 12:33:36.073027 kubelet[2622]: I0321 12:33:36.073002 2622 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 21 12:33:36.073853 kubelet[2622]: I0321 12:33:36.073836 2622 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 12:33:36.074884 kubelet[2622]: I0321 12:33:36.074865 2622 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 21 12:33:36.075041 kubelet[2622]: I0321 12:33:36.075029 2622 server.go:1287] "Started kubelet" Mar 21 12:33:36.075216 kubelet[2622]: I0321 12:33:36.075176 2622 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 12:33:36.075683 kubelet[2622]: I0321 12:33:36.075620 2622 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 12:33:36.076072 kubelet[2622]: I0321 12:33:36.076034 2622 server.go:490] "Adding debug handlers to kubelet server" Mar 21 12:33:36.076340 kubelet[2622]: I0321 12:33:36.076323 2622 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 12:33:36.079467 kubelet[2622]: E0321 12:33:36.079444 2622 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 21 12:33:36.079685 kubelet[2622]: I0321 12:33:36.079673 2622 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 12:33:36.080057 kubelet[2622]: I0321 12:33:36.080042 2622 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 21 12:33:36.080206 kubelet[2622]: I0321 12:33:36.080191 2622 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 21 12:33:36.084453 kubelet[2622]: I0321 12:33:36.084422 2622 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 21 12:33:36.088196 kubelet[2622]: I0321 12:33:36.088168 2622 reconciler.go:26] "Reconciler: start to sync state" Mar 21 12:33:36.088595 kubelet[2622]: E0321 12:33:36.088561 2622 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 21 12:33:36.098458 kubelet[2622]: I0321 12:33:36.094979 2622 factory.go:221] Registration of the systemd container factory successfully Mar 21 12:33:36.098458 kubelet[2622]: I0321 12:33:36.095092 2622 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 21 12:33:36.100580 kubelet[2622]: I0321 12:33:36.100351 2622 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 21 12:33:36.100682 kubelet[2622]: I0321 12:33:36.100602 2622 factory.go:221] Registration of the containerd container factory successfully Mar 21 12:33:36.102246 kubelet[2622]: I0321 12:33:36.102218 2622 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 21 12:33:36.102246 kubelet[2622]: I0321 12:33:36.102243 2622 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 21 12:33:36.102338 kubelet[2622]: I0321 12:33:36.102261 2622 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 21 12:33:36.102338 kubelet[2622]: I0321 12:33:36.102270 2622 kubelet.go:2388] "Starting kubelet main sync loop" Mar 21 12:33:36.102338 kubelet[2622]: E0321 12:33:36.102308 2622 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 12:33:36.132469 kubelet[2622]: I0321 12:33:36.132381 2622 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 21 12:33:36.132595 kubelet[2622]: I0321 12:33:36.132579 2622 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 21 12:33:36.132654 kubelet[2622]: I0321 12:33:36.132646 2622 state_mem.go:36] "Initialized new in-memory state store" Mar 21 12:33:36.132851 kubelet[2622]: I0321 12:33:36.132834 2622 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 21 12:33:36.133135 kubelet[2622]: I0321 12:33:36.133100 2622 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 21 12:33:36.133359 kubelet[2622]: I0321 12:33:36.133344 2622 policy_none.go:49] "None policy: Start" Mar 21 12:33:36.133592 kubelet[2622]: I0321 12:33:36.133571 2622 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 21 12:33:36.133735 kubelet[2622]: I0321 12:33:36.133722 2622 state_mem.go:35] "Initializing new in-memory state store" Mar 21 12:33:36.133944 kubelet[2622]: I0321 12:33:36.133913 2622 state_mem.go:75] "Updated machine memory state" Mar 21 12:33:36.137484 kubelet[2622]: I0321 12:33:36.137461 2622 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 12:33:36.137721 kubelet[2622]: I0321 12:33:36.137706 2622 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 21 12:33:36.137805 kubelet[2622]: I0321 12:33:36.137776 2622 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 12:33:36.138067 kubelet[2622]: I0321 12:33:36.138043 2622 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 12:33:36.138929 kubelet[2622]: E0321 12:33:36.138883 2622 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 21 12:33:36.203411 kubelet[2622]: I0321 12:33:36.203351 2622 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:36.203541 kubelet[2622]: I0321 12:33:36.203507 2622 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:36.203820 kubelet[2622]: I0321 12:33:36.203785 2622 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:36.241458 kubelet[2622]: I0321 12:33:36.241418 2622 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 21 12:33:36.248256 kubelet[2622]: I0321 12:33:36.248221 2622 kubelet_node_status.go:125] "Node was previously registered" node="localhost" Mar 21 12:33:36.248421 kubelet[2622]: I0321 12:33:36.248324 2622 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Mar 21 12:33:36.289010 kubelet[2622]: I0321 12:33:36.288952 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9358feba7050c46c4389e1ebd9242717-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"9358feba7050c46c4389e1ebd9242717\") " pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:36.289136 kubelet[2622]: I0321 12:33:36.289015 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3700e556aa2777679a324159272023f1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"3700e556aa2777679a324159272023f1\") " pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:36.289136 kubelet[2622]: I0321 12:33:36.289050 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9358feba7050c46c4389e1ebd9242717-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"9358feba7050c46c4389e1ebd9242717\") " pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:36.289136 kubelet[2622]: I0321 12:33:36.289076 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9358feba7050c46c4389e1ebd9242717-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"9358feba7050c46c4389e1ebd9242717\") " pod="kube-system/kube-apiserver-localhost" Mar 21 12:33:36.289136 kubelet[2622]: I0321 12:33:36.289093 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:36.289136 kubelet[2622]: I0321 12:33:36.289107 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:36.289242 kubelet[2622]: I0321 12:33:36.289120 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:36.289242 kubelet[2622]: I0321 12:33:36.289135 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:36.289242 kubelet[2622]: I0321 12:33:36.289159 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 21 12:33:36.509625 kubelet[2622]: E0321 12:33:36.509504 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:36.509727 kubelet[2622]: E0321 12:33:36.509681 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:36.510484 kubelet[2622]: E0321 12:33:36.510456 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:37.072368 kubelet[2622]: I0321 12:33:37.072337 2622 apiserver.go:52] "Watching apiserver" Mar 21 12:33:37.085120 kubelet[2622]: I0321 12:33:37.085087 2622 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 21 12:33:37.135884 kubelet[2622]: E0321 12:33:37.135844 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:37.136132 kubelet[2622]: I0321 12:33:37.136101 2622 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:37.138936 kubelet[2622]: E0321 12:33:37.136540 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:37.142139 kubelet[2622]: E0321 12:33:37.142116 2622 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Mar 21 12:33:37.142254 kubelet[2622]: E0321 12:33:37.142235 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:37.167462 kubelet[2622]: I0321 12:33:37.167402 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.167377292 podStartE2EDuration="1.167377292s" podCreationTimestamp="2025-03-21 12:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:33:37.158580344 
+0000 UTC m=+1.137251231" watchObservedRunningTime="2025-03-21 12:33:37.167377292 +0000 UTC m=+1.146048139" Mar 21 12:33:37.167578 kubelet[2622]: I0321 12:33:37.167492 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.167487893 podStartE2EDuration="1.167487893s" podCreationTimestamp="2025-03-21 12:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:33:37.16676713 +0000 UTC m=+1.145437977" watchObservedRunningTime="2025-03-21 12:33:37.167487893 +0000 UTC m=+1.146158700" Mar 21 12:33:38.139289 kubelet[2622]: E0321 12:33:38.138368 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:38.140817 kubelet[2622]: E0321 12:33:38.139775 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:39.139321 kubelet[2622]: E0321 12:33:39.139290 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:40.150498 kubelet[2622]: E0321 12:33:40.150410 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:40.165799 kubelet[2622]: I0321 12:33:40.165748 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=4.165736015 podStartE2EDuration="4.165736015s" podCreationTimestamp="2025-03-21 12:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:33:37.174537515 +0000 UTC m=+1.153208402" watchObservedRunningTime="2025-03-21 12:33:40.165736015 +0000 UTC m=+4.144406862" Mar 21 12:33:40.758612 sudo[1704]: pam_unix(sudo:session): session closed for user root Mar 21 12:33:40.767314 sshd[1701]: Connection closed by 10.0.0.1 port 46542 Mar 21 12:33:40.767741 sshd-session[1697]: pam_unix(sshd:session): session closed for user core Mar 21 12:33:40.770546 systemd[1]: sshd@8-10.0.0.87:22-10.0.0.1:46542.service: Deactivated successfully. Mar 21 12:33:40.772376 systemd[1]: session-9.scope: Deactivated successfully. Mar 21 12:33:40.772566 systemd[1]: session-9.scope: Consumed 7.772s CPU time, 222.6M memory peak. Mar 21 12:33:40.774251 systemd-logind[1467]: Session 9 logged out. Waiting for processes to exit. Mar 21 12:33:40.775325 systemd-logind[1467]: Removed session 9. Mar 21 12:33:41.140911 kubelet[2622]: E0321 12:33:41.140763 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:41.670903 kubelet[2622]: I0321 12:33:41.670867 2622 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 21 12:33:41.671661 containerd[1483]: time="2025-03-21T12:33:41.671556695Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
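The recurring "Nameserver limits exceeded" warnings mean the node's resolv.conf lists more nameservers than the kubelet will propagate to pods; only the first few survive, which is why the applied line above is "1.1.1.1 1.0.0.1 8.8.8.8". A standalone sketch of that truncation follows; the three-server cap mirrors the conventional resolver limit and should be treated as an assumption of this sketch rather than a quote of kubelet source.

```go
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("/etc/resolv.conf")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Collect every "nameserver" entry in order of appearance.
	var servers []string
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		if len(fields) >= 2 && fields[0] == "nameserver" {
			servers = append(servers, fields[1])
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}

	const maxNameservers = 3 // assumed cap, matching the three applied servers in the log
	if len(servers) > maxNameservers {
		fmt.Printf("omitting %d nameserver(s)\n", len(servers)-maxNameservers)
		servers = servers[:maxNameservers]
	}
	fmt.Println("applied nameserver line:", strings.Join(servers, " "))
}
```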
Mar 21 12:33:41.672633 kubelet[2622]: I0321 12:33:41.671768 2622 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 21 12:33:42.142242 kubelet[2622]: E0321 12:33:42.142109 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:42.418987 systemd[1]: Created slice kubepods-besteffort-podde5a83af_3ab3_42bb_9d21_2f6f1e60ae37.slice - libcontainer container kubepods-besteffort-podde5a83af_3ab3_42bb_9d21_2f6f1e60ae37.slice. Mar 21 12:33:42.439374 kubelet[2622]: I0321 12:33:42.439345 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/de5a83af-3ab3-42bb-9d21-2f6f1e60ae37-kube-proxy\") pod \"kube-proxy-57wc2\" (UID: \"de5a83af-3ab3-42bb-9d21-2f6f1e60ae37\") " pod="kube-system/kube-proxy-57wc2" Mar 21 12:33:42.439459 kubelet[2622]: I0321 12:33:42.439384 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/de5a83af-3ab3-42bb-9d21-2f6f1e60ae37-xtables-lock\") pod \"kube-proxy-57wc2\" (UID: \"de5a83af-3ab3-42bb-9d21-2f6f1e60ae37\") " pod="kube-system/kube-proxy-57wc2" Mar 21 12:33:42.439459 kubelet[2622]: I0321 12:33:42.439411 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de5a83af-3ab3-42bb-9d21-2f6f1e60ae37-lib-modules\") pod \"kube-proxy-57wc2\" (UID: \"de5a83af-3ab3-42bb-9d21-2f6f1e60ae37\") " pod="kube-system/kube-proxy-57wc2" Mar 21 12:33:42.439459 kubelet[2622]: I0321 12:33:42.439427 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c28sm\" (UniqueName: \"kubernetes.io/projected/de5a83af-3ab3-42bb-9d21-2f6f1e60ae37-kube-api-access-c28sm\") pod \"kube-proxy-57wc2\" (UID: \"de5a83af-3ab3-42bb-9d21-2f6f1e60ae37\") " pod="kube-system/kube-proxy-57wc2" Mar 21 12:33:42.729794 kubelet[2622]: E0321 12:33:42.729681 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:42.733243 containerd[1483]: time="2025-03-21T12:33:42.733202940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-57wc2,Uid:de5a83af-3ab3-42bb-9d21-2f6f1e60ae37,Namespace:kube-system,Attempt:0,}" Mar 21 12:33:42.771237 containerd[1483]: time="2025-03-21T12:33:42.771168389Z" level=info msg="connecting to shim 9e3ec05ba14c85c7423f9a51caa8003c1737bec9fad95a1484fb6e1b376dde28" address="unix:///run/containerd/s/1c01afe9d6428ac1fb8ccb18f74720c992467e38c8735d00d61c20abac689dcc" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:42.822119 systemd[1]: Started cri-containerd-9e3ec05ba14c85c7423f9a51caa8003c1737bec9fad95a1484fb6e1b376dde28.scope - libcontainer container 9e3ec05ba14c85c7423f9a51caa8003c1737bec9fad95a1484fb6e1b376dde28. Mar 21 12:33:42.853237 systemd[1]: Created slice kubepods-besteffort-pod96b6bf8e_cebc_4cbb_924a_508cf7108e8f.slice - libcontainer container kubepods-besteffort-pod96b6bf8e_cebc_4cbb_924a_508cf7108e8f.slice. 
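The VerifyControllerAttachedVolume entries for kube-proxy-57wc2 that follow correspond to a ConfigMap volume, two hostPath volumes, and a projected service-account token (the "kube-api-access-c28sm" volume, which the API server injects automatically). A sketch of roughly what that volume list looks like as API objects; the hostPath locations (/run/xtables.lock, /lib/modules) are the conventional ones for kube-proxy and are assumptions here, not values read from this log.

```go
package main

import (
	"fmt"
	"log"

	corev1 "k8s.io/api/core/v1"
	"sigs.k8s.io/yaml"
)

func main() {
	// Volume names mirror the reconciler entries; everything else is illustrative.
	vols := []corev1.Volume{
		{Name: "kube-proxy", VolumeSource: corev1.VolumeSource{
			ConfigMap: &corev1.ConfigMapVolumeSource{
				LocalObjectReference: corev1.LocalObjectReference{Name: "kube-proxy"},
			},
		}},
		{Name: "xtables-lock", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/run/xtables.lock"}, // assumed path
		}},
		{Name: "lib-modules", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/lib/modules"}, // assumed path
		}},
	}

	out, err := yaml.Marshal(vols)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(out))
}
```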
Mar 21 12:33:42.867869 containerd[1483]: time="2025-03-21T12:33:42.867829774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-57wc2,Uid:de5a83af-3ab3-42bb-9d21-2f6f1e60ae37,Namespace:kube-system,Attempt:0,} returns sandbox id \"9e3ec05ba14c85c7423f9a51caa8003c1737bec9fad95a1484fb6e1b376dde28\"" Mar 21 12:33:42.871097 kubelet[2622]: E0321 12:33:42.871071 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:42.876612 containerd[1483]: time="2025-03-21T12:33:42.876570594Z" level=info msg="CreateContainer within sandbox \"9e3ec05ba14c85c7423f9a51caa8003c1737bec9fad95a1484fb6e1b376dde28\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 21 12:33:42.885570 containerd[1483]: time="2025-03-21T12:33:42.885191534Z" level=info msg="Container 5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:42.912185 containerd[1483]: time="2025-03-21T12:33:42.912147077Z" level=info msg="CreateContainer within sandbox \"9e3ec05ba14c85c7423f9a51caa8003c1737bec9fad95a1484fb6e1b376dde28\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599\"" Mar 21 12:33:42.912815 containerd[1483]: time="2025-03-21T12:33:42.912786679Z" level=info msg="StartContainer for \"5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599\"" Mar 21 12:33:42.914089 containerd[1483]: time="2025-03-21T12:33:42.914065242Z" level=info msg="connecting to shim 5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599" address="unix:///run/containerd/s/1c01afe9d6428ac1fb8ccb18f74720c992467e38c8735d00d61c20abac689dcc" protocol=ttrpc version=3 Mar 21 12:33:42.932045 systemd[1]: Started cri-containerd-5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599.scope - libcontainer container 5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599. 
Mar 21 12:33:42.943754 kubelet[2622]: I0321 12:33:42.943723 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/96b6bf8e-cebc-4cbb-924a-508cf7108e8f-var-lib-calico\") pod \"tigera-operator-ccfc44587-69kqh\" (UID: \"96b6bf8e-cebc-4cbb-924a-508cf7108e8f\") " pod="tigera-operator/tigera-operator-ccfc44587-69kqh" Mar 21 12:33:42.943829 kubelet[2622]: I0321 12:33:42.943757 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtdpk\" (UniqueName: \"kubernetes.io/projected/96b6bf8e-cebc-4cbb-924a-508cf7108e8f-kube-api-access-wtdpk\") pod \"tigera-operator-ccfc44587-69kqh\" (UID: \"96b6bf8e-cebc-4cbb-924a-508cf7108e8f\") " pod="tigera-operator/tigera-operator-ccfc44587-69kqh" Mar 21 12:33:42.976196 containerd[1483]: time="2025-03-21T12:33:42.976125106Z" level=info msg="StartContainer for \"5d1c4e5d83f88919b94b0122fcc6686116492b5cb52fd9e2bb85e5a0616d8599\" returns successfully" Mar 21 12:33:43.147144 kubelet[2622]: E0321 12:33:43.147025 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:43.154427 kubelet[2622]: I0321 12:33:43.154365 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-57wc2" podStartSLOduration=1.154351219 podStartE2EDuration="1.154351219s" podCreationTimestamp="2025-03-21 12:33:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:33:43.153872818 +0000 UTC m=+7.132543625" watchObservedRunningTime="2025-03-21 12:33:43.154351219 +0000 UTC m=+7.133022066" Mar 21 12:33:43.157269 containerd[1483]: time="2025-03-21T12:33:43.156807064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-69kqh,Uid:96b6bf8e-cebc-4cbb-924a-508cf7108e8f,Namespace:tigera-operator,Attempt:0,}" Mar 21 12:33:43.178526 containerd[1483]: time="2025-03-21T12:33:43.178482792Z" level=info msg="connecting to shim 06561b111fb5c704b982370fff89290877c81734f61a41d4d9d57a37b1cf240a" address="unix:///run/containerd/s/3cb4f1a4a1bbc72a58644a5d2f206a337c74de3ad3ecf7bebe68503cc09282c7" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:43.201191 systemd[1]: Started cri-containerd-06561b111fb5c704b982370fff89290877c81734f61a41d4d9d57a37b1cf240a.scope - libcontainer container 06561b111fb5c704b982370fff89290877c81734f61a41d4d9d57a37b1cf240a. Mar 21 12:33:43.229131 containerd[1483]: time="2025-03-21T12:33:43.229095342Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-69kqh,Uid:96b6bf8e-cebc-4cbb-924a-508cf7108e8f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"06561b111fb5c704b982370fff89290877c81734f61a41d4d9d57a37b1cf240a\"" Mar 21 12:33:43.231501 containerd[1483]: time="2025-03-21T12:33:43.231473548Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 21 12:33:43.672964 update_engine[1469]: I20250321 12:33:43.672405 1469 update_attempter.cc:509] Updating boot flags... 
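The PullImage request for quay.io/tigera/operator:v1.36.5 below is served by containerd in the same k8s.io namespace as the sandboxes. An equivalent pull issued directly with the containerd Go client, assuming the default socket path; the image reference is the one the kubelet asks for in this log.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Pull and unpack into the CRI namespace so the kubelet can use the image.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	img, err := client.Pull(ctx, "quay.io/tigera/operator:v1.36.5", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	size, err := img.Size(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("pulled %s (%d bytes)\n", img.Name(), size)
}
```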
Mar 21 12:33:43.693949 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2972) Mar 21 12:33:43.728060 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2972) Mar 21 12:33:43.756025 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2972) Mar 21 12:33:44.554442 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2129742737.mount: Deactivated successfully. Mar 21 12:33:44.711965 kubelet[2622]: E0321 12:33:44.711936 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:44.963741 containerd[1483]: time="2025-03-21T12:33:44.963700840Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:44.964480 containerd[1483]: time="2025-03-21T12:33:44.964432001Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 21 12:33:44.965449 containerd[1483]: time="2025-03-21T12:33:44.965201043Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:44.967552 containerd[1483]: time="2025-03-21T12:33:44.967519208Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:44.968557 containerd[1483]: time="2025-03-21T12:33:44.968533370Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 1.737024422s" Mar 21 12:33:44.968657 containerd[1483]: time="2025-03-21T12:33:44.968641010Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 21 12:33:44.971684 containerd[1483]: time="2025-03-21T12:33:44.971650776Z" level=info msg="CreateContainer within sandbox \"06561b111fb5c704b982370fff89290877c81734f61a41d4d9d57a37b1cf240a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 21 12:33:44.977057 containerd[1483]: time="2025-03-21T12:33:44.977019787Z" level=info msg="Container 8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:44.981545 containerd[1483]: time="2025-03-21T12:33:44.981509236Z" level=info msg="CreateContainer within sandbox \"06561b111fb5c704b982370fff89290877c81734f61a41d4d9d57a37b1cf240a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3\"" Mar 21 12:33:44.981873 containerd[1483]: time="2025-03-21T12:33:44.981851237Z" level=info msg="StartContainer for \"8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3\"" Mar 21 12:33:44.982664 containerd[1483]: time="2025-03-21T12:33:44.982522038Z" level=info msg="connecting to shim 8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3" 
address="unix:///run/containerd/s/3cb4f1a4a1bbc72a58644a5d2f206a337c74de3ad3ecf7bebe68503cc09282c7" protocol=ttrpc version=3 Mar 21 12:33:45.008065 systemd[1]: Started cri-containerd-8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3.scope - libcontainer container 8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3. Mar 21 12:33:45.034047 containerd[1483]: time="2025-03-21T12:33:45.032899297Z" level=info msg="StartContainer for \"8c8df9af417899111540fe236b1d3bc2c2e7d3275d50e565991ced1cc93808e3\" returns successfully" Mar 21 12:33:45.161295 kubelet[2622]: E0321 12:33:45.161260 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:46.115500 kubelet[2622]: I0321 12:33:46.115286 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-ccfc44587-69kqh" podStartSLOduration=2.374920693 podStartE2EDuration="4.115270082s" podCreationTimestamp="2025-03-21 12:33:42 +0000 UTC" firstStartedPulling="2025-03-21 12:33:43.229976344 +0000 UTC m=+7.208647191" lastFinishedPulling="2025-03-21 12:33:44.970325733 +0000 UTC m=+8.948996580" observedRunningTime="2025-03-21 12:33:45.177525175 +0000 UTC m=+9.156196022" watchObservedRunningTime="2025-03-21 12:33:46.115270082 +0000 UTC m=+10.093940929" Mar 21 12:33:46.606862 kubelet[2622]: E0321 12:33:46.606821 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:47.161341 kubelet[2622]: E0321 12:33:47.161316 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:49.249163 systemd[1]: Created slice kubepods-besteffort-pod4c8c1b0d_2e3d_4e17_9025_86cc4e8c9091.slice - libcontainer container kubepods-besteffort-pod4c8c1b0d_2e3d_4e17_9025_86cc4e8c9091.slice. Mar 21 12:33:49.283877 kubelet[2622]: I0321 12:33:49.283826 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091-tigera-ca-bundle\") pod \"calico-typha-547fbb5d8b-fb2t2\" (UID: \"4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091\") " pod="calico-system/calico-typha-547fbb5d8b-fb2t2" Mar 21 12:33:49.283877 kubelet[2622]: I0321 12:33:49.283878 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpf6l\" (UniqueName: \"kubernetes.io/projected/4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091-kube-api-access-qpf6l\") pod \"calico-typha-547fbb5d8b-fb2t2\" (UID: \"4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091\") " pod="calico-system/calico-typha-547fbb5d8b-fb2t2" Mar 21 12:33:49.284241 kubelet[2622]: I0321 12:33:49.283896 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091-typha-certs\") pod \"calico-typha-547fbb5d8b-fb2t2\" (UID: \"4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091\") " pod="calico-system/calico-typha-547fbb5d8b-fb2t2" Mar 21 12:33:49.446950 systemd[1]: Created slice kubepods-besteffort-pod6c14fe44_f685_45b9_ae8b_c214a3d69959.slice - libcontainer container kubepods-besteffort-pod6c14fe44_f685_45b9_ae8b_c214a3d69959.slice. 
Mar 21 12:33:49.485143 kubelet[2622]: I0321 12:33:49.485077 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-var-run-calico\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485143 kubelet[2622]: I0321 12:33:49.485123 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-lib-modules\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485143 kubelet[2622]: I0321 12:33:49.485139 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-policysync\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485143 kubelet[2622]: I0321 12:33:49.485154 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c14fe44-f685-45b9-ae8b-c214a3d69959-tigera-ca-bundle\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485391 kubelet[2622]: I0321 12:33:49.485168 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-cni-net-dir\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485391 kubelet[2622]: I0321 12:33:49.485182 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-cni-log-dir\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485391 kubelet[2622]: I0321 12:33:49.485200 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6c14fe44-f685-45b9-ae8b-c214a3d69959-node-certs\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485391 kubelet[2622]: I0321 12:33:49.485216 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-xtables-lock\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485391 kubelet[2622]: I0321 12:33:49.485234 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdnpj\" (UniqueName: \"kubernetes.io/projected/6c14fe44-f685-45b9-ae8b-c214a3d69959-kube-api-access-vdnpj\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485511 kubelet[2622]: I0321 12:33:49.485275 2622 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-var-lib-calico\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485511 kubelet[2622]: I0321 12:33:49.485314 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-cni-bin-dir\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.485511 kubelet[2622]: I0321 12:33:49.485345 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6c14fe44-f685-45b9-ae8b-c214a3d69959-flexvol-driver-host\") pod \"calico-node-x2vl6\" (UID: \"6c14fe44-f685-45b9-ae8b-c214a3d69959\") " pod="calico-system/calico-node-x2vl6" Mar 21 12:33:49.561690 kubelet[2622]: E0321 12:33:49.561206 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:49.561780 containerd[1483]: time="2025-03-21T12:33:49.561701457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-547fbb5d8b-fb2t2,Uid:4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091,Namespace:calico-system,Attempt:0,}" Mar 21 12:33:49.581361 containerd[1483]: time="2025-03-21T12:33:49.581325286Z" level=info msg="connecting to shim 38d6db62c2bac0eb63c7bd3177191c0bcc8a7caf4fd97cf5fc41bdb2eed9b3d0" address="unix:///run/containerd/s/df440433e58ae2cf267f9f2f136a363f3bc3f64621373ee8817d0e6776f13c9f" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:49.601183 kubelet[2622]: E0321 12:33:49.601075 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.601183 kubelet[2622]: W0321 12:33:49.601098 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.601183 kubelet[2622]: E0321 12:33:49.601144 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.603191 kubelet[2622]: E0321 12:33:49.603176 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.603191 kubelet[2622]: W0321 12:33:49.603189 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.603314 kubelet[2622]: E0321 12:33:49.603299 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.603485 kubelet[2622]: E0321 12:33:49.603470 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.603485 kubelet[2622]: W0321 12:33:49.603484 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.603548 kubelet[2622]: E0321 12:33:49.603495 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.623001 kubelet[2622]: E0321 12:33:49.622792 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x5fd2" podUID="819cfe7b-19b8-4e1c-86f0-166cf2f4d650" Mar 21 12:33:49.628725 systemd[1]: Started cri-containerd-38d6db62c2bac0eb63c7bd3177191c0bcc8a7caf4fd97cf5fc41bdb2eed9b3d0.scope - libcontainer container 38d6db62c2bac0eb63c7bd3177191c0bcc8a7caf4fd97cf5fc41bdb2eed9b3d0. Mar 21 12:33:49.663322 kubelet[2622]: E0321 12:33:49.663076 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.663322 kubelet[2622]: W0321 12:33:49.663100 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.663322 kubelet[2622]: E0321 12:33:49.663128 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.663706 kubelet[2622]: E0321 12:33:49.663549 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.663706 kubelet[2622]: W0321 12:33:49.663562 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.663706 kubelet[2622]: E0321 12:33:49.663611 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.664173 kubelet[2622]: E0321 12:33:49.664021 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.664173 kubelet[2622]: W0321 12:33:49.664034 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.664173 kubelet[2622]: E0321 12:33:49.664072 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.664481 kubelet[2622]: E0321 12:33:49.664396 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.664481 kubelet[2622]: W0321 12:33:49.664409 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.664481 kubelet[2622]: E0321 12:33:49.664420 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.664858 containerd[1483]: time="2025-03-21T12:33:49.664700250Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-547fbb5d8b-fb2t2,Uid:4c8c1b0d-2e3d-4e17-9025-86cc4e8c9091,Namespace:calico-system,Attempt:0,} returns sandbox id \"38d6db62c2bac0eb63c7bd3177191c0bcc8a7caf4fd97cf5fc41bdb2eed9b3d0\"" Mar 21 12:33:49.665032 kubelet[2622]: E0321 12:33:49.664759 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.665032 kubelet[2622]: W0321 12:33:49.664771 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.665032 kubelet[2622]: E0321 12:33:49.664782 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.665266 kubelet[2622]: E0321 12:33:49.665183 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.665266 kubelet[2622]: W0321 12:33:49.665196 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.665266 kubelet[2622]: E0321 12:33:49.665208 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.665675 kubelet[2622]: E0321 12:33:49.665574 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.665675 kubelet[2622]: W0321 12:33:49.665586 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.665675 kubelet[2622]: E0321 12:33:49.665598 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.665951 kubelet[2622]: E0321 12:33:49.665848 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.665951 kubelet[2622]: W0321 12:33:49.665859 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.665951 kubelet[2622]: E0321 12:33:49.665870 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.666221 kubelet[2622]: E0321 12:33:49.666208 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.666325 kubelet[2622]: W0321 12:33:49.666270 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.666325 kubelet[2622]: E0321 12:33:49.666286 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.666400 kubelet[2622]: E0321 12:33:49.666339 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667280 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670685 kubelet[2622]: W0321 12:33:49.667290 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667319 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667531 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670685 kubelet[2622]: W0321 12:33:49.667540 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667549 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667721 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670685 kubelet[2622]: W0321 12:33:49.667734 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667742 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.670685 kubelet[2622]: E0321 12:33:49.667896 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670941 containerd[1483]: time="2025-03-21T12:33:49.667308414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 21 12:33:49.670979 kubelet[2622]: W0321 12:33:49.667906 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.670979 kubelet[2622]: E0321 12:33:49.667914 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.670979 kubelet[2622]: E0321 12:33:49.668101 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670979 kubelet[2622]: W0321 12:33:49.668108 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.670979 kubelet[2622]: E0321 12:33:49.668116 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.670979 kubelet[2622]: E0321 12:33:49.668266 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670979 kubelet[2622]: W0321 12:33:49.668273 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.670979 kubelet[2622]: E0321 12:33:49.668280 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.670979 kubelet[2622]: E0321 12:33:49.670693 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.670979 kubelet[2622]: W0321 12:33:49.670711 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.671173 kubelet[2622]: E0321 12:33:49.670721 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.671173 kubelet[2622]: E0321 12:33:49.670949 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.671173 kubelet[2622]: W0321 12:33:49.670959 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.671173 kubelet[2622]: E0321 12:33:49.670968 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.671173 kubelet[2622]: E0321 12:33:49.671132 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.671173 kubelet[2622]: W0321 12:33:49.671141 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.671173 kubelet[2622]: E0321 12:33:49.671150 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.671469 kubelet[2622]: E0321 12:33:49.671355 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.671469 kubelet[2622]: W0321 12:33:49.671367 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.671469 kubelet[2622]: E0321 12:33:49.671375 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.671585 kubelet[2622]: E0321 12:33:49.671547 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.671585 kubelet[2622]: W0321 12:33:49.671575 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.671585 kubelet[2622]: E0321 12:33:49.671585 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.688979 kubelet[2622]: E0321 12:33:49.688956 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.688979 kubelet[2622]: W0321 12:33:49.688974 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.689253 kubelet[2622]: E0321 12:33:49.688986 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.689253 kubelet[2622]: I0321 12:33:49.689012 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/819cfe7b-19b8-4e1c-86f0-166cf2f4d650-kubelet-dir\") pod \"csi-node-driver-x5fd2\" (UID: \"819cfe7b-19b8-4e1c-86f0-166cf2f4d650\") " pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:49.689253 kubelet[2622]: E0321 12:33:49.689244 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.689253 kubelet[2622]: W0321 12:33:49.689256 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.689346 kubelet[2622]: E0321 12:33:49.689271 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.689346 kubelet[2622]: I0321 12:33:49.689287 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/819cfe7b-19b8-4e1c-86f0-166cf2f4d650-varrun\") pod \"csi-node-driver-x5fd2\" (UID: \"819cfe7b-19b8-4e1c-86f0-166cf2f4d650\") " pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:49.689729 kubelet[2622]: E0321 12:33:49.689703 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.689729 kubelet[2622]: W0321 12:33:49.689719 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.689822 kubelet[2622]: E0321 12:33:49.689736 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.689822 kubelet[2622]: I0321 12:33:49.689755 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/819cfe7b-19b8-4e1c-86f0-166cf2f4d650-socket-dir\") pod \"csi-node-driver-x5fd2\" (UID: \"819cfe7b-19b8-4e1c-86f0-166cf2f4d650\") " pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:49.690253 kubelet[2622]: E0321 12:33:49.690231 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.690253 kubelet[2622]: W0321 12:33:49.690248 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.690786 kubelet[2622]: E0321 12:33:49.690465 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.691155 kubelet[2622]: E0321 12:33:49.691140 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.691155 kubelet[2622]: W0321 12:33:49.691154 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.691224 kubelet[2622]: E0321 12:33:49.691201 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.691899 kubelet[2622]: E0321 12:33:49.691869 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.691899 kubelet[2622]: W0321 12:33:49.691885 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.691976 kubelet[2622]: E0321 12:33:49.691924 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.692551 kubelet[2622]: E0321 12:33:49.692525 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.692588 kubelet[2622]: W0321 12:33:49.692558 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.692639 kubelet[2622]: E0321 12:33:49.692602 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.692724 kubelet[2622]: I0321 12:33:49.692706 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qz88\" (UniqueName: \"kubernetes.io/projected/819cfe7b-19b8-4e1c-86f0-166cf2f4d650-kube-api-access-6qz88\") pod \"csi-node-driver-x5fd2\" (UID: \"819cfe7b-19b8-4e1c-86f0-166cf2f4d650\") " pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:49.692898 kubelet[2622]: E0321 12:33:49.692882 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.692963 kubelet[2622]: W0321 12:33:49.692910 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.692996 kubelet[2622]: E0321 12:33:49.692978 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.693120 kubelet[2622]: E0321 12:33:49.693107 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.693120 kubelet[2622]: W0321 12:33:49.693117 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.693170 kubelet[2622]: E0321 12:33:49.693127 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.693516 kubelet[2622]: E0321 12:33:49.693499 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.693516 kubelet[2622]: W0321 12:33:49.693513 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.693637 kubelet[2622]: E0321 12:33:49.693528 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.693802 kubelet[2622]: E0321 12:33:49.693787 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.693802 kubelet[2622]: W0321 12:33:49.693799 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.693874 kubelet[2622]: E0321 12:33:49.693808 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.694028 kubelet[2622]: E0321 12:33:49.694016 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.694028 kubelet[2622]: W0321 12:33:49.694026 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.694130 kubelet[2622]: E0321 12:33:49.694034 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.694228 kubelet[2622]: E0321 12:33:49.694209 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.694228 kubelet[2622]: W0321 12:33:49.694219 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.694228 kubelet[2622]: E0321 12:33:49.694227 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.694303 kubelet[2622]: I0321 12:33:49.694242 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/819cfe7b-19b8-4e1c-86f0-166cf2f4d650-registration-dir\") pod \"csi-node-driver-x5fd2\" (UID: \"819cfe7b-19b8-4e1c-86f0-166cf2f4d650\") " pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:49.694944 kubelet[2622]: E0321 12:33:49.694850 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.694944 kubelet[2622]: W0321 12:33:49.694867 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.694944 kubelet[2622]: E0321 12:33:49.694879 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.695556 kubelet[2622]: E0321 12:33:49.695528 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.695556 kubelet[2622]: W0321 12:33:49.695542 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.695556 kubelet[2622]: E0321 12:33:49.695553 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.749692 kubelet[2622]: E0321 12:33:49.749653 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:49.750242 containerd[1483]: time="2025-03-21T12:33:49.750113137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x2vl6,Uid:6c14fe44-f685-45b9-ae8b-c214a3d69959,Namespace:calico-system,Attempt:0,}" Mar 21 12:33:49.771110 containerd[1483]: time="2025-03-21T12:33:49.771058968Z" level=info msg="connecting to shim e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f" address="unix:///run/containerd/s/0a22203a1761ee7ce291d3e407337f9de945df5972c31e98a9914889464e8f36" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:33:49.792089 systemd[1]: Started cri-containerd-e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f.scope - libcontainer container e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f. Mar 21 12:33:49.795354 kubelet[2622]: E0321 12:33:49.795329 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.795536 kubelet[2622]: W0321 12:33:49.795514 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.795636 kubelet[2622]: E0321 12:33:49.795617 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.796055 kubelet[2622]: E0321 12:33:49.795976 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.796169 kubelet[2622]: W0321 12:33:49.796153 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.796249 kubelet[2622]: E0321 12:33:49.796238 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.796542 kubelet[2622]: E0321 12:33:49.796524 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.796593 kubelet[2622]: W0321 12:33:49.796579 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.796751 kubelet[2622]: E0321 12:33:49.796606 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.796871 kubelet[2622]: E0321 12:33:49.796847 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.796967 kubelet[2622]: W0321 12:33:49.796948 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.797117 kubelet[2622]: E0321 12:33:49.797035 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.797458 kubelet[2622]: E0321 12:33:49.797443 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.798649 kubelet[2622]: W0321 12:33:49.797520 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.798649 kubelet[2622]: E0321 12:33:49.797639 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.798649 kubelet[2622]: E0321 12:33:49.797765 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.798649 kubelet[2622]: W0321 12:33:49.797774 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.798649 kubelet[2622]: E0321 12:33:49.797803 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.798649 kubelet[2622]: E0321 12:33:49.798027 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.798649 kubelet[2622]: W0321 12:33:49.798038 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.798649 kubelet[2622]: E0321 12:33:49.798052 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.798649 kubelet[2622]: E0321 12:33:49.798246 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.798649 kubelet[2622]: W0321 12:33:49.798254 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.798875 kubelet[2622]: E0321 12:33:49.798267 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.798875 kubelet[2622]: E0321 12:33:49.798457 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.798875 kubelet[2622]: W0321 12:33:49.798481 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.798875 kubelet[2622]: E0321 12:33:49.798497 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.799148 kubelet[2622]: E0321 12:33:49.799130 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.799148 kubelet[2622]: W0321 12:33:49.799147 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.799226 kubelet[2622]: E0321 12:33:49.799166 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.799425 kubelet[2622]: E0321 12:33:49.799411 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.799425 kubelet[2622]: W0321 12:33:49.799425 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.799486 kubelet[2622]: E0321 12:33:49.799474 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.799688 kubelet[2622]: E0321 12:33:49.799673 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.799688 kubelet[2622]: W0321 12:33:49.799686 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.799778 kubelet[2622]: E0321 12:33:49.799730 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.799923 kubelet[2622]: E0321 12:33:49.799905 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.800020 kubelet[2622]: W0321 12:33:49.799924 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.800020 kubelet[2622]: E0321 12:33:49.799957 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.800124 kubelet[2622]: E0321 12:33:49.800112 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.800124 kubelet[2622]: W0321 12:33:49.800122 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.800198 kubelet[2622]: E0321 12:33:49.800135 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.800332 kubelet[2622]: E0321 12:33:49.800297 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.800332 kubelet[2622]: W0321 12:33:49.800308 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.800332 kubelet[2622]: E0321 12:33:49.800316 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.800493 kubelet[2622]: E0321 12:33:49.800481 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.800493 kubelet[2622]: W0321 12:33:49.800491 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.800562 kubelet[2622]: E0321 12:33:49.800504 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.800703 kubelet[2622]: E0321 12:33:49.800692 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.800703 kubelet[2622]: W0321 12:33:49.800703 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.800787 kubelet[2622]: E0321 12:33:49.800773 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.800855 kubelet[2622]: E0321 12:33:49.800845 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.800855 kubelet[2622]: W0321 12:33:49.800854 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.800992 kubelet[2622]: E0321 12:33:49.800910 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.801093 kubelet[2622]: E0321 12:33:49.800999 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.801093 kubelet[2622]: W0321 12:33:49.801072 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.801206 kubelet[2622]: E0321 12:33:49.801189 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.801498 kubelet[2622]: E0321 12:33:49.801428 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.801498 kubelet[2622]: W0321 12:33:49.801441 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.801498 kubelet[2622]: E0321 12:33:49.801485 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.801790 kubelet[2622]: E0321 12:33:49.801777 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.801965 kubelet[2622]: W0321 12:33:49.801839 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.801965 kubelet[2622]: E0321 12:33:49.801861 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.802878 kubelet[2622]: E0321 12:33:49.802092 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.802878 kubelet[2622]: W0321 12:33:49.802102 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.802878 kubelet[2622]: E0321 12:33:49.802116 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.802878 kubelet[2622]: E0321 12:33:49.802343 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.802878 kubelet[2622]: W0321 12:33:49.802352 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.802878 kubelet[2622]: E0321 12:33:49.802365 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.802878 kubelet[2622]: E0321 12:33:49.802559 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.802878 kubelet[2622]: W0321 12:33:49.802568 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.802878 kubelet[2622]: E0321 12:33:49.802581 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.803644 kubelet[2622]: E0321 12:33:49.803139 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.803644 kubelet[2622]: W0321 12:33:49.803153 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.803644 kubelet[2622]: E0321 12:33:49.803164 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:49.816198 kubelet[2622]: E0321 12:33:49.816014 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:49.816198 kubelet[2622]: W0321 12:33:49.816039 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:49.816198 kubelet[2622]: E0321 12:33:49.816084 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:49.862349 containerd[1483]: time="2025-03-21T12:33:49.862298863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x2vl6,Uid:6c14fe44-f685-45b9-ae8b-c214a3d69959,Namespace:calico-system,Attempt:0,} returns sandbox id \"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\"" Mar 21 12:33:49.863084 kubelet[2622]: E0321 12:33:49.863057 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:51.102900 kubelet[2622]: E0321 12:33:51.102831 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x5fd2" podUID="819cfe7b-19b8-4e1c-86f0-166cf2f4d650" Mar 21 12:33:51.973932 containerd[1483]: time="2025-03-21T12:33:51.973883927Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:51.974880 containerd[1483]: time="2025-03-21T12:33:51.974688888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 21 12:33:51.975707 containerd[1483]: time="2025-03-21T12:33:51.975636249Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:51.978037 containerd[1483]: time="2025-03-21T12:33:51.977971812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:51.978702 containerd[1483]: time="2025-03-21T12:33:51.978626693Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 2.311291079s" Mar 21 12:33:51.978702 containerd[1483]: time="2025-03-21T12:33:51.978666773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 21 12:33:51.979743 containerd[1483]: time="2025-03-21T12:33:51.979546254Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 21 12:33:51.992322 containerd[1483]: time="2025-03-21T12:33:51.991530870Z" level=info msg="CreateContainer within sandbox \"38d6db62c2bac0eb63c7bd3177191c0bcc8a7caf4fd97cf5fc41bdb2eed9b3d0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 21 12:33:51.996904 containerd[1483]: time="2025-03-21T12:33:51.996866357Z" level=info msg="Container e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:52.008326 containerd[1483]: time="2025-03-21T12:33:52.008282851Z" level=info msg="CreateContainer within sandbox \"38d6db62c2bac0eb63c7bd3177191c0bcc8a7caf4fd97cf5fc41bdb2eed9b3d0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id 
\"e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7\"" Mar 21 12:33:52.009978 containerd[1483]: time="2025-03-21T12:33:52.009056332Z" level=info msg="StartContainer for \"e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7\"" Mar 21 12:33:52.010210 containerd[1483]: time="2025-03-21T12:33:52.010107933Z" level=info msg="connecting to shim e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7" address="unix:///run/containerd/s/df440433e58ae2cf267f9f2f136a363f3bc3f64621373ee8817d0e6776f13c9f" protocol=ttrpc version=3 Mar 21 12:33:52.032079 systemd[1]: Started cri-containerd-e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7.scope - libcontainer container e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7. Mar 21 12:33:52.069555 containerd[1483]: time="2025-03-21T12:33:52.069515166Z" level=info msg="StartContainer for \"e3c08869c1a4d1b808cc523ef64636a7d0f56b43341edf5c08113291a593f4b7\" returns successfully" Mar 21 12:33:52.178334 kubelet[2622]: E0321 12:33:52.178305 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:52.187980 kubelet[2622]: E0321 12:33:52.187395 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.187980 kubelet[2622]: W0321 12:33:52.187417 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.187980 kubelet[2622]: E0321 12:33:52.187436 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.188345 kubelet[2622]: E0321 12:33:52.188197 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.188345 kubelet[2622]: W0321 12:33:52.188211 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.188345 kubelet[2622]: E0321 12:33:52.188223 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.189008 kubelet[2622]: E0321 12:33:52.188993 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.189204 kubelet[2622]: W0321 12:33:52.189084 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.189204 kubelet[2622]: E0321 12:33:52.189103 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:52.190624 kubelet[2622]: E0321 12:33:52.189339 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.190949 kubelet[2622]: W0321 12:33:52.190806 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.190949 kubelet[2622]: E0321 12:33:52.190827 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.191161 kubelet[2622]: E0321 12:33:52.191094 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.191161 kubelet[2622]: W0321 12:33:52.191105 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.191161 kubelet[2622]: E0321 12:33:52.191116 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.191561 kubelet[2622]: E0321 12:33:52.191445 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.191561 kubelet[2622]: W0321 12:33:52.191458 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.191561 kubelet[2622]: E0321 12:33:52.191468 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.191711 kubelet[2622]: E0321 12:33:52.191701 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.191806 kubelet[2622]: W0321 12:33:52.191755 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.191806 kubelet[2622]: E0321 12:33:52.191768 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.192165 kubelet[2622]: E0321 12:33:52.192076 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.192165 kubelet[2622]: W0321 12:33:52.192090 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.192165 kubelet[2622]: E0321 12:33:52.192108 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:52.192559 kubelet[2622]: E0321 12:33:52.192502 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.193059 kubelet[2622]: W0321 12:33:52.193041 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.193592 kubelet[2622]: E0321 12:33:52.193148 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.194008 kubelet[2622]: E0321 12:33:52.193992 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.194097 kubelet[2622]: W0321 12:33:52.194085 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.194157 kubelet[2622]: E0321 12:33:52.194145 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.194426 kubelet[2622]: E0321 12:33:52.194413 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.194549 kubelet[2622]: W0321 12:33:52.194491 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.194549 kubelet[2622]: E0321 12:33:52.194507 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.194879 kubelet[2622]: E0321 12:33:52.194784 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.194879 kubelet[2622]: W0321 12:33:52.194795 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.194879 kubelet[2622]: E0321 12:33:52.194805 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.195431 kubelet[2622]: E0321 12:33:52.195047 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.195585 kubelet[2622]: W0321 12:33:52.195501 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.195585 kubelet[2622]: E0321 12:33:52.195521 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:52.195887 kubelet[2622]: E0321 12:33:52.195875 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.196018 kubelet[2622]: W0321 12:33:52.195963 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.196018 kubelet[2622]: E0321 12:33:52.195980 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.196370 kubelet[2622]: E0321 12:33:52.196289 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.196370 kubelet[2622]: W0321 12:33:52.196301 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.196370 kubelet[2622]: E0321 12:33:52.196311 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.219229 kubelet[2622]: E0321 12:33:52.219188 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.219229 kubelet[2622]: W0321 12:33:52.219213 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.219229 kubelet[2622]: E0321 12:33:52.219236 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.219509 kubelet[2622]: E0321 12:33:52.219489 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.219549 kubelet[2622]: W0321 12:33:52.219532 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.219582 kubelet[2622]: E0321 12:33:52.219552 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.219816 kubelet[2622]: E0321 12:33:52.219779 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.219862 kubelet[2622]: W0321 12:33:52.219800 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.219862 kubelet[2622]: E0321 12:33:52.219846 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:52.220077 kubelet[2622]: E0321 12:33:52.220055 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.220077 kubelet[2622]: W0321 12:33:52.220069 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.220157 kubelet[2622]: E0321 12:33:52.220084 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.220334 kubelet[2622]: E0321 12:33:52.220312 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.220334 kubelet[2622]: W0321 12:33:52.220326 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.220411 kubelet[2622]: E0321 12:33:52.220340 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.220584 kubelet[2622]: E0321 12:33:52.220559 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.220584 kubelet[2622]: W0321 12:33:52.220573 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.220640 kubelet[2622]: E0321 12:33:52.220588 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.221747 kubelet[2622]: E0321 12:33:52.221711 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.221747 kubelet[2622]: W0321 12:33:52.221730 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.221747 kubelet[2622]: E0321 12:33:52.221749 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.223016 kubelet[2622]: E0321 12:33:52.222658 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.223016 kubelet[2622]: W0321 12:33:52.222680 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.223016 kubelet[2622]: E0321 12:33:52.222957 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:52.223842 kubelet[2622]: E0321 12:33:52.223815 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.223842 kubelet[2622]: W0321 12:33:52.223833 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.224513 kubelet[2622]: E0321 12:33:52.223870 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.225215 kubelet[2622]: E0321 12:33:52.225191 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.225215 kubelet[2622]: W0321 12:33:52.225207 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.225303 kubelet[2622]: E0321 12:33:52.225243 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.225636 kubelet[2622]: E0321 12:33:52.225506 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.225636 kubelet[2622]: W0321 12:33:52.225631 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.225697 kubelet[2622]: E0321 12:33:52.225662 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.225867 kubelet[2622]: E0321 12:33:52.225840 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.225867 kubelet[2622]: W0321 12:33:52.225855 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.225956 kubelet[2622]: E0321 12:33:52.225874 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.226185 kubelet[2622]: E0321 12:33:52.226163 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.226185 kubelet[2622]: W0321 12:33:52.226176 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.226254 kubelet[2622]: E0321 12:33:52.226193 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:52.226592 kubelet[2622]: E0321 12:33:52.226543 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.226592 kubelet[2622]: W0321 12:33:52.226587 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.226668 kubelet[2622]: E0321 12:33:52.226619 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.227148 kubelet[2622]: E0321 12:33:52.226966 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.227148 kubelet[2622]: W0321 12:33:52.226989 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.227148 kubelet[2622]: E0321 12:33:52.227008 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.227266 kubelet[2622]: E0321 12:33:52.227210 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.227266 kubelet[2622]: W0321 12:33:52.227221 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.227266 kubelet[2622]: E0321 12:33:52.227249 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.227716 kubelet[2622]: E0321 12:33:52.227569 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.227716 kubelet[2622]: W0321 12:33:52.227590 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.227716 kubelet[2622]: E0321 12:33:52.227601 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:52.228192 kubelet[2622]: E0321 12:33:52.228156 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:52.228192 kubelet[2622]: W0321 12:33:52.228173 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:52.228192 kubelet[2622]: E0321 12:33:52.228185 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.103546 kubelet[2622]: E0321 12:33:53.103495 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x5fd2" podUID="819cfe7b-19b8-4e1c-86f0-166cf2f4d650" Mar 21 12:33:53.179542 kubelet[2622]: I0321 12:33:53.179503 2622 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 12:33:53.179940 kubelet[2622]: E0321 12:33:53.179824 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:53.201495 containerd[1483]: time="2025-03-21T12:33:53.201452894Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:53.202562 containerd[1483]: time="2025-03-21T12:33:53.202504215Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 21 12:33:53.203437 containerd[1483]: time="2025-03-21T12:33:53.203400576Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:53.204370 kubelet[2622]: E0321 12:33:53.204266 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.204370 kubelet[2622]: W0321 12:33:53.204285 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.204370 kubelet[2622]: E0321 12:33:53.204304 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.204530 kubelet[2622]: E0321 12:33:53.204474 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.204530 kubelet[2622]: W0321 12:33:53.204484 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.204530 kubelet[2622]: E0321 12:33:53.204524 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.204711 kubelet[2622]: E0321 12:33:53.204699 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.204711 kubelet[2622]: W0321 12:33:53.204710 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.204777 kubelet[2622]: E0321 12:33:53.204718 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.204875 kubelet[2622]: E0321 12:33:53.204864 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.204875 kubelet[2622]: W0321 12:33:53.204873 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.204957 kubelet[2622]: E0321 12:33:53.204880 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.205070 kubelet[2622]: E0321 12:33:53.205060 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.205070 kubelet[2622]: W0321 12:33:53.205070 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.205136 kubelet[2622]: E0321 12:33:53.205078 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.205253 kubelet[2622]: E0321 12:33:53.205240 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.205253 kubelet[2622]: W0321 12:33:53.205249 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.205358 kubelet[2622]: E0321 12:33:53.205256 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.205423 containerd[1483]: time="2025-03-21T12:33:53.205282138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:53.205482 kubelet[2622]: E0321 12:33:53.205398 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.205482 kubelet[2622]: W0321 12:33:53.205405 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.205482 kubelet[2622]: E0321 12:33:53.205413 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.205582 kubelet[2622]: E0321 12:33:53.205569 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.205582 kubelet[2622]: W0321 12:33:53.205576 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.205766 kubelet[2622]: E0321 12:33:53.205585 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.205766 kubelet[2622]: E0321 12:33:53.205725 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.205766 kubelet[2622]: W0321 12:33:53.205733 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.205766 kubelet[2622]: E0321 12:33:53.205740 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.205986 kubelet[2622]: E0321 12:33:53.205893 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.205986 kubelet[2622]: W0321 12:33:53.205901 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.205986 kubelet[2622]: E0321 12:33:53.205909 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.206280 containerd[1483]: time="2025-03-21T12:33:53.205809099Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.226226645s" Mar 21 12:33:53.206280 containerd[1483]: time="2025-03-21T12:33:53.205839739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 21 12:33:53.206353 kubelet[2622]: E0321 12:33:53.206080 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.206353 kubelet[2622]: W0321 12:33:53.206088 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.206353 kubelet[2622]: E0321 12:33:53.206096 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.206353 kubelet[2622]: E0321 12:33:53.206249 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.206353 kubelet[2622]: W0321 12:33:53.206262 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.206353 kubelet[2622]: E0321 12:33:53.206270 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.206502 kubelet[2622]: E0321 12:33:53.206425 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.206502 kubelet[2622]: W0321 12:33:53.206434 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.206502 kubelet[2622]: E0321 12:33:53.206442 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.206630 kubelet[2622]: E0321 12:33:53.206575 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.206630 kubelet[2622]: W0321 12:33:53.206585 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.206630 kubelet[2622]: E0321 12:33:53.206594 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.206765 kubelet[2622]: E0321 12:33:53.206755 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.206765 kubelet[2622]: W0321 12:33:53.206764 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.206813 kubelet[2622]: E0321 12:33:53.206771 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.208269 containerd[1483]: time="2025-03-21T12:33:53.208236342Z" level=info msg="CreateContainer within sandbox \"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 21 12:33:53.228224 kubelet[2622]: E0321 12:33:53.228089 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.228224 kubelet[2622]: W0321 12:33:53.228108 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.228224 kubelet[2622]: E0321 12:33:53.228123 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.228469 kubelet[2622]: E0321 12:33:53.228455 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.228520 kubelet[2622]: W0321 12:33:53.228510 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.228580 kubelet[2622]: E0321 12:33:53.228570 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.228864 kubelet[2622]: E0321 12:33:53.228849 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.229059 kubelet[2622]: W0321 12:33:53.228978 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.229059 kubelet[2622]: E0321 12:33:53.229006 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.229235 kubelet[2622]: E0321 12:33:53.229199 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.229235 kubelet[2622]: W0321 12:33:53.229217 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.229235 kubelet[2622]: E0321 12:33:53.229234 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.230027 kubelet[2622]: E0321 12:33:53.230010 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.230190 kubelet[2622]: W0321 12:33:53.230076 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.230190 kubelet[2622]: E0321 12:33:53.230096 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.230354 kubelet[2622]: E0321 12:33:53.230328 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.230354 kubelet[2622]: W0321 12:33:53.230341 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.230535 kubelet[2622]: E0321 12:33:53.230490 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.230710 kubelet[2622]: E0321 12:33:53.230697 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.230819 kubelet[2622]: W0321 12:33:53.230758 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.230819 kubelet[2622]: E0321 12:33:53.230788 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.231175 kubelet[2622]: E0321 12:33:53.231092 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.231175 kubelet[2622]: W0321 12:33:53.231106 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.231175 kubelet[2622]: E0321 12:33:53.231133 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.231557 kubelet[2622]: E0321 12:33:53.231421 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.231557 kubelet[2622]: W0321 12:33:53.231436 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.231557 kubelet[2622]: E0321 12:33:53.231453 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.231771 kubelet[2622]: E0321 12:33:53.231674 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.231771 kubelet[2622]: W0321 12:33:53.231686 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.231771 kubelet[2622]: E0321 12:33:53.231700 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.231871 kubelet[2622]: E0321 12:33:53.231848 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.231871 kubelet[2622]: W0321 12:33:53.231855 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.231871 kubelet[2622]: E0321 12:33:53.231863 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.232050 kubelet[2622]: E0321 12:33:53.232040 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.232050 kubelet[2622]: W0321 12:33:53.232050 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.232101 kubelet[2622]: E0321 12:33:53.232059 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.232410 kubelet[2622]: E0321 12:33:53.232395 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.232410 kubelet[2622]: W0321 12:33:53.232408 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.232493 kubelet[2622]: E0321 12:33:53.232421 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.232660 kubelet[2622]: E0321 12:33:53.232621 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.232660 kubelet[2622]: W0321 12:33:53.232633 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.232660 kubelet[2622]: E0321 12:33:53.232647 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 12:33:53.232810 kubelet[2622]: E0321 12:33:53.232797 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.232810 kubelet[2622]: W0321 12:33:53.232808 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.232866 kubelet[2622]: E0321 12:33:53.232818 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.233021 kubelet[2622]: E0321 12:33:53.233010 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.233021 kubelet[2622]: W0321 12:33:53.233020 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.233294 kubelet[2622]: E0321 12:33:53.233026 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.233294 kubelet[2622]: E0321 12:33:53.233203 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.233294 kubelet[2622]: W0321 12:33:53.233210 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.233294 kubelet[2622]: E0321 12:33:53.233217 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.233628 kubelet[2622]: E0321 12:33:53.233612 2622 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 12:33:53.233628 kubelet[2622]: W0321 12:33:53.233627 2622 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 12:33:53.233684 kubelet[2622]: E0321 12:33:53.233636 2622 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 12:33:53.237495 containerd[1483]: time="2025-03-21T12:33:53.237459895Z" level=info msg="Container 504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:53.238540 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1106212057.mount: Deactivated successfully. 
Mar 21 12:33:53.244324 containerd[1483]: time="2025-03-21T12:33:53.244224903Z" level=info msg="CreateContainer within sandbox \"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\"" Mar 21 12:33:53.244850 containerd[1483]: time="2025-03-21T12:33:53.244828744Z" level=info msg="StartContainer for \"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\"" Mar 21 12:33:53.246127 containerd[1483]: time="2025-03-21T12:33:53.246103985Z" level=info msg="connecting to shim 504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b" address="unix:///run/containerd/s/0a22203a1761ee7ce291d3e407337f9de945df5972c31e98a9914889464e8f36" protocol=ttrpc version=3 Mar 21 12:33:53.269092 systemd[1]: Started cri-containerd-504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b.scope - libcontainer container 504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b. Mar 21 12:33:53.322387 containerd[1483]: time="2025-03-21T12:33:53.322327552Z" level=info msg="StartContainer for \"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\" returns successfully" Mar 21 12:33:53.331770 systemd[1]: cri-containerd-504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b.scope: Deactivated successfully. Mar 21 12:33:53.363927 containerd[1483]: time="2025-03-21T12:33:53.363268799Z" level=info msg="received exit event container_id:\"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\" id:\"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\" pid:3326 exited_at:{seconds:1742560433 nanos:352049626}" Mar 21 12:33:53.363927 containerd[1483]: time="2025-03-21T12:33:53.363443439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\" id:\"504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b\" pid:3326 exited_at:{seconds:1742560433 nanos:352049626}" Mar 21 12:33:53.410093 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-504a236c9cb9f78c4e3692494cd8fea9212b9a4596b3a6755f6eae79794a936b-rootfs.mount: Deactivated successfully. 
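The TaskExit events above carry the container's exit time as a raw protobuf timestamp (exited_at:{seconds:1742560433 nanos:352049626}); converted back to wall-clock time it lands just before the 12:33:53.363 journal entries that record the event. A small sketch using the values copied from the log:

```go
// Convert the exited_at timestamp from the TaskExit event above back to wall-clock time.
package main

import (
	"fmt"
	"time"
)

func main() {
	exitedAt := time.Unix(1742560433, 352049626).UTC()
	fmt.Println(exitedAt) // 2025-03-21 12:33:53.352049626 +0000 UTC
}
```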
Mar 21 12:33:54.183431 kubelet[2622]: E0321 12:33:54.183328 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:54.184145 containerd[1483]: time="2025-03-21T12:33:54.184069207Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 21 12:33:54.198304 kubelet[2622]: I0321 12:33:54.198239 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-547fbb5d8b-fb2t2" podStartSLOduration=2.885879781 podStartE2EDuration="5.198222742s" podCreationTimestamp="2025-03-21 12:33:49 +0000 UTC" firstStartedPulling="2025-03-21 12:33:49.667011413 +0000 UTC m=+13.645682260" lastFinishedPulling="2025-03-21 12:33:51.979354374 +0000 UTC m=+15.958025221" observedRunningTime="2025-03-21 12:33:52.193940478 +0000 UTC m=+16.172611325" watchObservedRunningTime="2025-03-21 12:33:54.198222742 +0000 UTC m=+18.176893549" Mar 21 12:33:55.103298 kubelet[2622]: E0321 12:33:55.103258 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x5fd2" podUID="819cfe7b-19b8-4e1c-86f0-166cf2f4d650" Mar 21 12:33:57.102646 kubelet[2622]: E0321 12:33:57.102600 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x5fd2" podUID="819cfe7b-19b8-4e1c-86f0-166cf2f4d650" Mar 21 12:33:58.278229 containerd[1483]: time="2025-03-21T12:33:58.278178310Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:58.278697 containerd[1483]: time="2025-03-21T12:33:58.278647550Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 21 12:33:58.279435 containerd[1483]: time="2025-03-21T12:33:58.279407271Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:58.281221 containerd[1483]: time="2025-03-21T12:33:58.281172952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:33:58.281951 containerd[1483]: time="2025-03-21T12:33:58.281911913Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 4.097798986s" Mar 21 12:33:58.282002 containerd[1483]: time="2025-03-21T12:33:58.281952793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 21 12:33:58.284079 containerd[1483]: time="2025-03-21T12:33:58.284047315Z" level=info msg="CreateContainer within sandbox 
\"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 21 12:33:58.291683 containerd[1483]: time="2025-03-21T12:33:58.289232799Z" level=info msg="Container 13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:33:58.297115 containerd[1483]: time="2025-03-21T12:33:58.297067765Z" level=info msg="CreateContainer within sandbox \"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\"" Mar 21 12:33:58.297743 containerd[1483]: time="2025-03-21T12:33:58.297500326Z" level=info msg="StartContainer for \"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\"" Mar 21 12:33:58.298824 containerd[1483]: time="2025-03-21T12:33:58.298796167Z" level=info msg="connecting to shim 13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75" address="unix:///run/containerd/s/0a22203a1761ee7ce291d3e407337f9de945df5972c31e98a9914889464e8f36" protocol=ttrpc version=3 Mar 21 12:33:58.320081 systemd[1]: Started cri-containerd-13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75.scope - libcontainer container 13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75. Mar 21 12:33:58.364968 containerd[1483]: time="2025-03-21T12:33:58.364210301Z" level=info msg="StartContainer for \"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\" returns successfully" Mar 21 12:33:58.949372 systemd[1]: cri-containerd-13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75.scope: Deactivated successfully. Mar 21 12:33:58.949802 systemd[1]: cri-containerd-13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75.scope: Consumed 460ms CPU time, 159.5M memory peak, 4K read from disk, 150.3M written to disk. Mar 21 12:33:58.956155 containerd[1483]: time="2025-03-21T12:33:58.956086552Z" level=info msg="received exit event container_id:\"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\" id:\"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\" pid:3386 exited_at:{seconds:1742560438 nanos:955884752}" Mar 21 12:33:58.956264 containerd[1483]: time="2025-03-21T12:33:58.956239632Z" level=info msg="TaskExit event in podsandbox handler container_id:\"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\" id:\"13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75\" pid:3386 exited_at:{seconds:1742560438 nanos:955884752}" Mar 21 12:33:58.973813 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-13713f1c7a87da6f21b673f6eba253a8eb7fae59301cb1a9577ce25d4efccb75-rootfs.mount: Deactivated successfully. Mar 21 12:33:58.981818 kubelet[2622]: I0321 12:33:58.981778 2622 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 21 12:33:59.022099 systemd[1]: Created slice kubepods-besteffort-podf07a8b17_5d2d_449c_b660_373aeac8779a.slice - libcontainer container kubepods-besteffort-podf07a8b17_5d2d_449c_b660_373aeac8779a.slice. Mar 21 12:33:59.032202 systemd[1]: Created slice kubepods-burstable-poddf7ef3b5_d48b_47d8_b992_f217ba61f745.slice - libcontainer container kubepods-burstable-poddf7ef3b5_d48b_47d8_b992_f217ba61f745.slice. 
Mar 21 12:33:59.036782 systemd[1]: Created slice kubepods-besteffort-pod7279b680_a2d0_4d33_907f_eca56de0a976.slice - libcontainer container kubepods-besteffort-pod7279b680_a2d0_4d33_907f_eca56de0a976.slice. Mar 21 12:33:59.045568 systemd[1]: Created slice kubepods-burstable-pod4b998b44_1450_4f89_99b4_102d22ff0e46.slice - libcontainer container kubepods-burstable-pod4b998b44_1450_4f89_99b4_102d22ff0e46.slice. Mar 21 12:33:59.057275 systemd[1]: Created slice kubepods-besteffort-pod6da3189b_eb72_4225_933a_4a863afc15d4.slice - libcontainer container kubepods-besteffort-pod6da3189b_eb72_4225_933a_4a863afc15d4.slice. Mar 21 12:33:59.071347 kubelet[2622]: I0321 12:33:59.071295 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df7ef3b5-d48b-47d8-b992-f217ba61f745-config-volume\") pod \"coredns-668d6bf9bc-zlht7\" (UID: \"df7ef3b5-d48b-47d8-b992-f217ba61f745\") " pod="kube-system/coredns-668d6bf9bc-zlht7" Mar 21 12:33:59.071347 kubelet[2622]: I0321 12:33:59.071344 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9nh\" (UniqueName: \"kubernetes.io/projected/6da3189b-eb72-4225-933a-4a863afc15d4-kube-api-access-km9nh\") pod \"calico-apiserver-8675c54584-xs442\" (UID: \"6da3189b-eb72-4225-933a-4a863afc15d4\") " pod="calico-apiserver/calico-apiserver-8675c54584-xs442" Mar 21 12:33:59.071578 kubelet[2622]: I0321 12:33:59.071376 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjg5m\" (UniqueName: \"kubernetes.io/projected/df7ef3b5-d48b-47d8-b992-f217ba61f745-kube-api-access-sjg5m\") pod \"coredns-668d6bf9bc-zlht7\" (UID: \"df7ef3b5-d48b-47d8-b992-f217ba61f745\") " pod="kube-system/coredns-668d6bf9bc-zlht7" Mar 21 12:33:59.071578 kubelet[2622]: I0321 12:33:59.071395 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7mwk\" (UniqueName: \"kubernetes.io/projected/4b998b44-1450-4f89-99b4-102d22ff0e46-kube-api-access-s7mwk\") pod \"coredns-668d6bf9bc-czlm2\" (UID: \"4b998b44-1450-4f89-99b4-102d22ff0e46\") " pod="kube-system/coredns-668d6bf9bc-czlm2" Mar 21 12:33:59.071578 kubelet[2622]: I0321 12:33:59.071412 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f07a8b17-5d2d-449c-b660-373aeac8779a-tigera-ca-bundle\") pod \"calico-kube-controllers-5cbdffc7d-9rvwr\" (UID: \"f07a8b17-5d2d-449c-b660-373aeac8779a\") " pod="calico-system/calico-kube-controllers-5cbdffc7d-9rvwr" Mar 21 12:33:59.071578 kubelet[2622]: I0321 12:33:59.071428 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6j6s\" (UniqueName: \"kubernetes.io/projected/7279b680-a2d0-4d33-907f-eca56de0a976-kube-api-access-t6j6s\") pod \"calico-apiserver-8675c54584-755sn\" (UID: \"7279b680-a2d0-4d33-907f-eca56de0a976\") " pod="calico-apiserver/calico-apiserver-8675c54584-755sn" Mar 21 12:33:59.071578 kubelet[2622]: I0321 12:33:59.071447 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6da3189b-eb72-4225-933a-4a863afc15d4-calico-apiserver-certs\") pod \"calico-apiserver-8675c54584-xs442\" (UID: \"6da3189b-eb72-4225-933a-4a863afc15d4\") " 
pod="calico-apiserver/calico-apiserver-8675c54584-xs442" Mar 21 12:33:59.071693 kubelet[2622]: I0321 12:33:59.071464 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nlp8\" (UniqueName: \"kubernetes.io/projected/f07a8b17-5d2d-449c-b660-373aeac8779a-kube-api-access-5nlp8\") pod \"calico-kube-controllers-5cbdffc7d-9rvwr\" (UID: \"f07a8b17-5d2d-449c-b660-373aeac8779a\") " pod="calico-system/calico-kube-controllers-5cbdffc7d-9rvwr" Mar 21 12:33:59.071693 kubelet[2622]: I0321 12:33:59.071483 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7279b680-a2d0-4d33-907f-eca56de0a976-calico-apiserver-certs\") pod \"calico-apiserver-8675c54584-755sn\" (UID: \"7279b680-a2d0-4d33-907f-eca56de0a976\") " pod="calico-apiserver/calico-apiserver-8675c54584-755sn" Mar 21 12:33:59.071871 kubelet[2622]: I0321 12:33:59.071795 2622 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b998b44-1450-4f89-99b4-102d22ff0e46-config-volume\") pod \"coredns-668d6bf9bc-czlm2\" (UID: \"4b998b44-1450-4f89-99b4-102d22ff0e46\") " pod="kube-system/coredns-668d6bf9bc-czlm2" Mar 21 12:33:59.107396 systemd[1]: Created slice kubepods-besteffort-pod819cfe7b_19b8_4e1c_86f0_166cf2f4d650.slice - libcontainer container kubepods-besteffort-pod819cfe7b_19b8_4e1c_86f0_166cf2f4d650.slice. Mar 21 12:33:59.109954 containerd[1483]: time="2025-03-21T12:33:59.109848074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x5fd2,Uid:819cfe7b-19b8-4e1c-86f0-166cf2f4d650,Namespace:calico-system,Attempt:0,}" Mar 21 12:33:59.207663 kubelet[2622]: E0321 12:33:59.207149 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:59.208679 containerd[1483]: time="2025-03-21T12:33:59.208639911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 21 12:33:59.291170 containerd[1483]: time="2025-03-21T12:33:59.291122295Z" level=error msg="Failed to destroy network for sandbox \"9de8b1482d97914e8fe698cf37d6189db2a5f18bca4b7a1eb6e23b5ba0e5ab3e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.292030 containerd[1483]: time="2025-03-21T12:33:59.291983136Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x5fd2,Uid:819cfe7b-19b8-4e1c-86f0-166cf2f4d650,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de8b1482d97914e8fe698cf37d6189db2a5f18bca4b7a1eb6e23b5ba0e5ab3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.296052 kubelet[2622]: E0321 12:33:59.296000 2622 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de8b1482d97914e8fe698cf37d6189db2a5f18bca4b7a1eb6e23b5ba0e5ab3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 21 12:33:59.298715 kubelet[2622]: E0321 12:33:59.298675 2622 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de8b1482d97914e8fe698cf37d6189db2a5f18bca4b7a1eb6e23b5ba0e5ab3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:59.298715 kubelet[2622]: E0321 12:33:59.298714 2622 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de8b1482d97914e8fe698cf37d6189db2a5f18bca4b7a1eb6e23b5ba0e5ab3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x5fd2" Mar 21 12:33:59.298820 kubelet[2622]: E0321 12:33:59.298760 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x5fd2_calico-system(819cfe7b-19b8-4e1c-86f0-166cf2f4d650)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x5fd2_calico-system(819cfe7b-19b8-4e1c-86f0-166cf2f4d650)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9de8b1482d97914e8fe698cf37d6189db2a5f18bca4b7a1eb6e23b5ba0e5ab3e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x5fd2" podUID="819cfe7b-19b8-4e1c-86f0-166cf2f4d650" Mar 21 12:33:59.300487 systemd[1]: run-netns-cni\x2dcf7971d3\x2d2145\x2df796\x2d6b1c\x2d103ce2609044.mount: Deactivated successfully. 
Mar 21 12:33:59.326662 containerd[1483]: time="2025-03-21T12:33:59.326611363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbdffc7d-9rvwr,Uid:f07a8b17-5d2d-449c-b660-373aeac8779a,Namespace:calico-system,Attempt:0,}" Mar 21 12:33:59.338921 kubelet[2622]: E0321 12:33:59.338880 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:59.340171 containerd[1483]: time="2025-03-21T12:33:59.340134773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-755sn,Uid:7279b680-a2d0-4d33-907f-eca56de0a976,Namespace:calico-apiserver,Attempt:0,}" Mar 21 12:33:59.341314 containerd[1483]: time="2025-03-21T12:33:59.341292494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zlht7,Uid:df7ef3b5-d48b-47d8-b992-f217ba61f745,Namespace:kube-system,Attempt:0,}" Mar 21 12:33:59.354631 kubelet[2622]: E0321 12:33:59.354138 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:33:59.359064 containerd[1483]: time="2025-03-21T12:33:59.358452187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-czlm2,Uid:4b998b44-1450-4f89-99b4-102d22ff0e46,Namespace:kube-system,Attempt:0,}" Mar 21 12:33:59.362063 containerd[1483]: time="2025-03-21T12:33:59.362019910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-xs442,Uid:6da3189b-eb72-4225-933a-4a863afc15d4,Namespace:calico-apiserver,Attempt:0,}" Mar 21 12:33:59.410507 containerd[1483]: time="2025-03-21T12:33:59.410455348Z" level=error msg="Failed to destroy network for sandbox \"48e6ac01a5600ced52ac8f60bee28cee7993fe8dcae4647704f35e2c62dc5564\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.412298 containerd[1483]: time="2025-03-21T12:33:59.412180429Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbdffc7d-9rvwr,Uid:f07a8b17-5d2d-449c-b660-373aeac8779a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48e6ac01a5600ced52ac8f60bee28cee7993fe8dcae4647704f35e2c62dc5564\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.412633 kubelet[2622]: E0321 12:33:59.412588 2622 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48e6ac01a5600ced52ac8f60bee28cee7993fe8dcae4647704f35e2c62dc5564\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.412794 kubelet[2622]: E0321 12:33:59.412642 2622 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48e6ac01a5600ced52ac8f60bee28cee7993fe8dcae4647704f35e2c62dc5564\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cbdffc7d-9rvwr" Mar 21 12:33:59.412794 kubelet[2622]: E0321 12:33:59.412662 2622 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48e6ac01a5600ced52ac8f60bee28cee7993fe8dcae4647704f35e2c62dc5564\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cbdffc7d-9rvwr" Mar 21 12:33:59.412794 kubelet[2622]: E0321 12:33:59.412709 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5cbdffc7d-9rvwr_calico-system(f07a8b17-5d2d-449c-b660-373aeac8779a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5cbdffc7d-9rvwr_calico-system(f07a8b17-5d2d-449c-b660-373aeac8779a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48e6ac01a5600ced52ac8f60bee28cee7993fe8dcae4647704f35e2c62dc5564\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5cbdffc7d-9rvwr" podUID="f07a8b17-5d2d-449c-b660-373aeac8779a" Mar 21 12:33:59.417012 containerd[1483]: time="2025-03-21T12:33:59.416971353Z" level=error msg="Failed to destroy network for sandbox \"feeb486ff712592d3131e558189deb3e7689c0626349cd0fe0c69837faf0bfaf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.418051 containerd[1483]: time="2025-03-21T12:33:59.418009354Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zlht7,Uid:df7ef3b5-d48b-47d8-b992-f217ba61f745,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"feeb486ff712592d3131e558189deb3e7689c0626349cd0fe0c69837faf0bfaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.418240 kubelet[2622]: E0321 12:33:59.418195 2622 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feeb486ff712592d3131e558189deb3e7689c0626349cd0fe0c69837faf0bfaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.418548 kubelet[2622]: E0321 12:33:59.418246 2622 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feeb486ff712592d3131e558189deb3e7689c0626349cd0fe0c69837faf0bfaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zlht7" Mar 21 12:33:59.418548 kubelet[2622]: E0321 12:33:59.418268 2622 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"feeb486ff712592d3131e558189deb3e7689c0626349cd0fe0c69837faf0bfaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zlht7" Mar 21 12:33:59.418548 kubelet[2622]: E0321 12:33:59.418311 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-zlht7_kube-system(df7ef3b5-d48b-47d8-b992-f217ba61f745)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-zlht7_kube-system(df7ef3b5-d48b-47d8-b992-f217ba61f745)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"feeb486ff712592d3131e558189deb3e7689c0626349cd0fe0c69837faf0bfaf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-zlht7" podUID="df7ef3b5-d48b-47d8-b992-f217ba61f745" Mar 21 12:33:59.421064 containerd[1483]: time="2025-03-21T12:33:59.421032716Z" level=error msg="Failed to destroy network for sandbox \"c8d626b5b4fc59f52a774ed63f38e8d659be8a8ce2b575fe6a578362e56631a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.422251 containerd[1483]: time="2025-03-21T12:33:59.422214757Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-czlm2,Uid:4b998b44-1450-4f89-99b4-102d22ff0e46,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8d626b5b4fc59f52a774ed63f38e8d659be8a8ce2b575fe6a578362e56631a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.422431 containerd[1483]: time="2025-03-21T12:33:59.422303157Z" level=error msg="Failed to destroy network for sandbox \"2e96b8f35d1e6d8a9f5df5b7b08fd6d1b8dd88bb3b1bee84ccdc8a55520c3197\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.422599 kubelet[2622]: E0321 12:33:59.422557 2622 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8d626b5b4fc59f52a774ed63f38e8d659be8a8ce2b575fe6a578362e56631a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.422666 kubelet[2622]: E0321 12:33:59.422612 2622 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8d626b5b4fc59f52a774ed63f38e8d659be8a8ce2b575fe6a578362e56631a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-czlm2" Mar 21 12:33:59.422666 kubelet[2622]: E0321 12:33:59.422630 2622 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"c8d626b5b4fc59f52a774ed63f38e8d659be8a8ce2b575fe6a578362e56631a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-czlm2" Mar 21 12:33:59.422730 kubelet[2622]: E0321 12:33:59.422660 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-czlm2_kube-system(4b998b44-1450-4f89-99b4-102d22ff0e46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-czlm2_kube-system(4b998b44-1450-4f89-99b4-102d22ff0e46)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c8d626b5b4fc59f52a774ed63f38e8d659be8a8ce2b575fe6a578362e56631a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-czlm2" podUID="4b998b44-1450-4f89-99b4-102d22ff0e46" Mar 21 12:33:59.423540 containerd[1483]: time="2025-03-21T12:33:59.423508278Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-755sn,Uid:7279b680-a2d0-4d33-907f-eca56de0a976,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e96b8f35d1e6d8a9f5df5b7b08fd6d1b8dd88bb3b1bee84ccdc8a55520c3197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.423890 kubelet[2622]: E0321 12:33:59.423756 2622 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e96b8f35d1e6d8a9f5df5b7b08fd6d1b8dd88bb3b1bee84ccdc8a55520c3197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.423890 kubelet[2622]: E0321 12:33:59.423798 2622 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e96b8f35d1e6d8a9f5df5b7b08fd6d1b8dd88bb3b1bee84ccdc8a55520c3197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8675c54584-755sn" Mar 21 12:33:59.423890 kubelet[2622]: E0321 12:33:59.423814 2622 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e96b8f35d1e6d8a9f5df5b7b08fd6d1b8dd88bb3b1bee84ccdc8a55520c3197\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8675c54584-755sn" Mar 21 12:33:59.424090 kubelet[2622]: E0321 12:33:59.423847 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8675c54584-755sn_calico-apiserver(7279b680-a2d0-4d33-907f-eca56de0a976)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-8675c54584-755sn_calico-apiserver(7279b680-a2d0-4d33-907f-eca56de0a976)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e96b8f35d1e6d8a9f5df5b7b08fd6d1b8dd88bb3b1bee84ccdc8a55520c3197\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8675c54584-755sn" podUID="7279b680-a2d0-4d33-907f-eca56de0a976" Mar 21 12:33:59.439718 containerd[1483]: time="2025-03-21T12:33:59.439666171Z" level=error msg="Failed to destroy network for sandbox \"1cbdfffd0ea569a5c4462e8e0219bfa0c354b9be39efb1df139c91f43d544ee5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.440559 containerd[1483]: time="2025-03-21T12:33:59.440531651Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-xs442,Uid:6da3189b-eb72-4225-933a-4a863afc15d4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cbdfffd0ea569a5c4462e8e0219bfa0c354b9be39efb1df139c91f43d544ee5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.440709 kubelet[2622]: E0321 12:33:59.440682 2622 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cbdfffd0ea569a5c4462e8e0219bfa0c354b9be39efb1df139c91f43d544ee5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 12:33:59.440755 kubelet[2622]: E0321 12:33:59.440727 2622 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cbdfffd0ea569a5c4462e8e0219bfa0c354b9be39efb1df139c91f43d544ee5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8675c54584-xs442" Mar 21 12:33:59.440755 kubelet[2622]: E0321 12:33:59.440746 2622 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cbdfffd0ea569a5c4462e8e0219bfa0c354b9be39efb1df139c91f43d544ee5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8675c54584-xs442" Mar 21 12:33:59.440826 kubelet[2622]: E0321 12:33:59.440784 2622 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8675c54584-xs442_calico-apiserver(6da3189b-eb72-4225-933a-4a863afc15d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8675c54584-xs442_calico-apiserver(6da3189b-eb72-4225-933a-4a863afc15d4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1cbdfffd0ea569a5c4462e8e0219bfa0c354b9be39efb1df139c91f43d544ee5\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8675c54584-xs442" podUID="6da3189b-eb72-4225-933a-4a863afc15d4" Mar 21 12:34:00.293052 systemd[1]: run-netns-cni\x2deaa28231\x2daf3d\x2d027c\x2d1d7c\x2df52573448e72.mount: Deactivated successfully. Mar 21 12:34:00.293360 systemd[1]: run-netns-cni\x2d8e2f3ffb\x2df594\x2db774\x2d91c2\x2d1586f7c473c5.mount: Deactivated successfully. Mar 21 12:34:00.293433 systemd[1]: run-netns-cni\x2ddf60595a\x2dee60\x2d5650\x2d2555\x2da031d1b032f0.mount: Deactivated successfully. Mar 21 12:34:02.718118 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount718251146.mount: Deactivated successfully. Mar 21 12:34:02.983289 containerd[1483]: time="2025-03-21T12:34:02.983176929Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:02.984101 containerd[1483]: time="2025-03-21T12:34:02.983911370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 21 12:34:02.988180 containerd[1483]: time="2025-03-21T12:34:02.988147933Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:02.991177 containerd[1483]: time="2025-03-21T12:34:02.991123855Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:02.992283 containerd[1483]: time="2025-03-21T12:34:02.991772055Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 3.782763984s" Mar 21 12:34:02.992283 containerd[1483]: time="2025-03-21T12:34:02.991814055Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 21 12:34:03.015321 containerd[1483]: time="2025-03-21T12:34:03.015270309Z" level=info msg="CreateContainer within sandbox \"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 21 12:34:03.032678 containerd[1483]: time="2025-03-21T12:34:03.031033999Z" level=info msg="Container 7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:03.043042 containerd[1483]: time="2025-03-21T12:34:03.042982646Z" level=info msg="CreateContainer within sandbox \"e565ca60d41416ecc0ccbba5e14b7da04b896c9e8aaecce7e6196c1e2f64fa1f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35\"" Mar 21 12:34:03.043826 containerd[1483]: time="2025-03-21T12:34:03.043693007Z" level=info msg="StartContainer for \"7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35\"" Mar 21 12:34:03.045145 containerd[1483]: time="2025-03-21T12:34:03.045110207Z" level=info msg="connecting to shim 
7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35" address="unix:///run/containerd/s/0a22203a1761ee7ce291d3e407337f9de945df5972c31e98a9914889464e8f36" protocol=ttrpc version=3 Mar 21 12:34:03.077460 systemd[1]: Started cri-containerd-7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35.scope - libcontainer container 7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35. Mar 21 12:34:03.240010 containerd[1483]: time="2025-03-21T12:34:03.239874404Z" level=info msg="StartContainer for \"7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35\" returns successfully" Mar 21 12:34:03.328049 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 21 12:34:03.328193 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 21 12:34:04.245820 kubelet[2622]: E0321 12:34:04.245791 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:04.271431 kubelet[2622]: I0321 12:34:04.270460 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-x2vl6" podStartSLOduration=2.129923535 podStartE2EDuration="15.270440413s" podCreationTimestamp="2025-03-21 12:33:49 +0000 UTC" firstStartedPulling="2025-03-21 12:33:49.863740985 +0000 UTC m=+13.842411832" lastFinishedPulling="2025-03-21 12:34:03.004257863 +0000 UTC m=+26.982928710" observedRunningTime="2025-03-21 12:34:04.268820413 +0000 UTC m=+28.247491380" watchObservedRunningTime="2025-03-21 12:34:04.270440413 +0000 UTC m=+28.249111260" Mar 21 12:34:04.645182 systemd[1]: Started sshd@9-10.0.0.87:22-10.0.0.1:34628.service - OpenSSH per-connection server daemon (10.0.0.1:34628). Mar 21 12:34:04.733226 sshd[3727]: Accepted publickey for core from 10.0.0.1 port 34628 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:04.733301 sshd-session[3727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:04.739886 systemd-logind[1467]: New session 10 of user core. Mar 21 12:34:04.746121 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 21 12:34:04.915837 sshd[3816]: Connection closed by 10.0.0.1 port 34628 Mar 21 12:34:04.916493 sshd-session[3727]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:04.919825 systemd[1]: sshd@9-10.0.0.87:22-10.0.0.1:34628.service: Deactivated successfully. Mar 21 12:34:04.921678 systemd[1]: session-10.scope: Deactivated successfully. Mar 21 12:34:04.922313 systemd-logind[1467]: Session 10 logged out. Waiting for processes to exit. Mar 21 12:34:04.923072 systemd-logind[1467]: Removed session 10. Mar 21 12:34:05.247100 kubelet[2622]: I0321 12:34:05.246996 2622 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 12:34:05.247436 kubelet[2622]: E0321 12:34:05.247385 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:09.930180 systemd[1]: Started sshd@10-10.0.0.87:22-10.0.0.1:34644.service - OpenSSH per-connection server daemon (10.0.0.1:34644). 
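The pod_startup_latency_tracker entry above for calico-node-x2vl6 is internally consistent: the end-to-end duration equals the watch-observed running time minus the pod creation timestamp, and the reported SLO duration matches that figure minus the time spent pulling the image. A quick check with Python's datetime, using only the timestamps from that log entry (nanoseconds are truncated to microseconds because %f carries six digits):

    from datetime import datetime

    FMT = "%Y-%m-%d %H:%M:%S.%f"

    created    = datetime.strptime("2025-03-21 12:33:49.000000", FMT)  # podCreationTimestamp
    running    = datetime.strptime("2025-03-21 12:34:04.270440", FMT)  # watchObservedRunningTime
    pull_start = datetime.strptime("2025-03-21 12:33:49.863740", FMT)  # firstStartedPulling
    pull_end   = datetime.strptime("2025-03-21 12:34:03.004257", FMT)  # lastFinishedPulling

    e2e  = (running - created).total_seconds()      # ~15.270440 -> podStartE2EDuration "15.270440413s"
    pull = (pull_end - pull_start).total_seconds()  # ~13.140517 -> time spent pulling calico/node
    slo  = e2e - pull                               # ~2.129923  -> podStartSLOduration 2.129923535

    print(f"{e2e:.6f} {pull:.6f} {slo:.6f}")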
Mar 21 12:34:09.985951 sshd[3937]: Accepted publickey for core from 10.0.0.1 port 34644 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:09.986494 sshd-session[3937]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:09.990862 systemd-logind[1467]: New session 11 of user core. Mar 21 12:34:09.996087 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 21 12:34:10.103827 containerd[1483]: time="2025-03-21T12:34:10.103775795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-755sn,Uid:7279b680-a2d0-4d33-907f-eca56de0a976,Namespace:calico-apiserver,Attempt:0,}" Mar 21 12:34:10.112114 sshd[3956]: Connection closed by 10.0.0.1 port 34644 Mar 21 12:34:10.112384 sshd-session[3937]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:10.115400 systemd[1]: sshd@10-10.0.0.87:22-10.0.0.1:34644.service: Deactivated successfully. Mar 21 12:34:10.117772 systemd[1]: session-11.scope: Deactivated successfully. Mar 21 12:34:10.119587 systemd-logind[1467]: Session 11 logged out. Waiting for processes to exit. Mar 21 12:34:10.121415 systemd-logind[1467]: Removed session 11. Mar 21 12:34:10.300567 systemd-networkd[1399]: calie8e191a170b: Link UP Mar 21 12:34:10.301048 systemd-networkd[1399]: calie8e191a170b: Gained carrier Mar 21 12:34:10.312318 containerd[1483]: 2025-03-21 12:34:10.130 [INFO][3969] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 21 12:34:10.312318 containerd[1483]: 2025-03-21 12:34:10.176 [INFO][3969] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8675c54584--755sn-eth0 calico-apiserver-8675c54584- calico-apiserver 7279b680-a2d0-4d33-907f-eca56de0a976 687 0 2025-03-21 12:33:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8675c54584 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8675c54584-755sn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie8e191a170b [] []}} ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-" Mar 21 12:34:10.312318 containerd[1483]: 2025-03-21 12:34:10.176 [INFO][3969] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.312318 containerd[1483]: 2025-03-21 12:34:10.254 [INFO][3986] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" HandleID="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Workload="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.267 [INFO][3986] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" HandleID="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" 
Workload="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000317330), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8675c54584-755sn", "timestamp":"2025-03-21 12:34:10.254351773 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.267 [INFO][3986] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.268 [INFO][3986] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.268 [INFO][3986] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.270 [INFO][3986] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" host="localhost" Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.274 [INFO][3986] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.278 [INFO][3986] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.280 [INFO][3986] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.281 [INFO][3986] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:10.312547 containerd[1483]: 2025-03-21 12:34:10.282 [INFO][3986] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" host="localhost" Mar 21 12:34:10.312971 containerd[1483]: 2025-03-21 12:34:10.283 [INFO][3986] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0 Mar 21 12:34:10.312971 containerd[1483]: 2025-03-21 12:34:10.286 [INFO][3986] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" host="localhost" Mar 21 12:34:10.312971 containerd[1483]: 2025-03-21 12:34:10.291 [INFO][3986] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" host="localhost" Mar 21 12:34:10.312971 containerd[1483]: 2025-03-21 12:34:10.291 [INFO][3986] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" host="localhost" Mar 21 12:34:10.312971 containerd[1483]: 2025-03-21 12:34:10.291 [INFO][3986] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 12:34:10.312971 containerd[1483]: 2025-03-21 12:34:10.291 [INFO][3986] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" HandleID="k8s-pod-network.6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Workload="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.313103 containerd[1483]: 2025-03-21 12:34:10.293 [INFO][3969] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8675c54584--755sn-eth0", GenerateName:"calico-apiserver-8675c54584-", Namespace:"calico-apiserver", SelfLink:"", UID:"7279b680-a2d0-4d33-907f-eca56de0a976", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8675c54584", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8675c54584-755sn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie8e191a170b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:10.313184 containerd[1483]: 2025-03-21 12:34:10.293 [INFO][3969] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.313184 containerd[1483]: 2025-03-21 12:34:10.293 [INFO][3969] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8e191a170b ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.313184 containerd[1483]: 2025-03-21 12:34:10.300 [INFO][3969] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.313275 containerd[1483]: 2025-03-21 12:34:10.301 [INFO][3969] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8675c54584--755sn-eth0", GenerateName:"calico-apiserver-8675c54584-", Namespace:"calico-apiserver", SelfLink:"", UID:"7279b680-a2d0-4d33-907f-eca56de0a976", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8675c54584", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0", Pod:"calico-apiserver-8675c54584-755sn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie8e191a170b", MAC:"4e:fc:15:f6:2b:2d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:10.313361 containerd[1483]: 2025-03-21 12:34:10.309 [INFO][3969] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-755sn" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--755sn-eth0" Mar 21 12:34:10.409632 containerd[1483]: time="2025-03-21T12:34:10.409566752Z" level=info msg="connecting to shim 6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0" address="unix:///run/containerd/s/e216348a6d4dc2b253fbe24b4a33124a0bc9436d114c27b753fb5b771a6d088e" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:34:10.438089 systemd[1]: Started cri-containerd-6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0.scope - libcontainer container 6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0. 
Mar 21 12:34:10.448834 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 21 12:34:10.478735 containerd[1483]: time="2025-03-21T12:34:10.478681298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-755sn,Uid:7279b680-a2d0-4d33-907f-eca56de0a976,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0\"" Mar 21 12:34:10.480306 containerd[1483]: time="2025-03-21T12:34:10.480275899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 21 12:34:11.104141 containerd[1483]: time="2025-03-21T12:34:11.104087895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x5fd2,Uid:819cfe7b-19b8-4e1c-86f0-166cf2f4d650,Namespace:calico-system,Attempt:0,}" Mar 21 12:34:11.234232 systemd-networkd[1399]: cali6f09fdbee66: Link UP Mar 21 12:34:11.234437 systemd-networkd[1399]: cali6f09fdbee66: Gained carrier Mar 21 12:34:11.254168 containerd[1483]: 2025-03-21 12:34:11.124 [INFO][4080] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 21 12:34:11.254168 containerd[1483]: 2025-03-21 12:34:11.138 [INFO][4080] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--x5fd2-eth0 csi-node-driver- calico-system 819cfe7b-19b8-4e1c-86f0-166cf2f4d650 602 0 2025-03-21 12:33:49 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-x5fd2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6f09fdbee66 [] []}} ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-" Mar 21 12:34:11.254168 containerd[1483]: 2025-03-21 12:34:11.138 [INFO][4080] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.254168 containerd[1483]: 2025-03-21 12:34:11.162 [INFO][4094] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" HandleID="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Workload="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.186 [INFO][4094] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" HandleID="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Workload="localhost-k8s-csi--node--driver--x5fd2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003c2d20), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-x5fd2", "timestamp":"2025-03-21 12:34:11.162686756 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.186 [INFO][4094] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.186 [INFO][4094] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.187 [INFO][4094] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.189 [INFO][4094] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" host="localhost" Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.192 [INFO][4094] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.196 [INFO][4094] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.197 [INFO][4094] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.199 [INFO][4094] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:11.254512 containerd[1483]: 2025-03-21 12:34:11.199 [INFO][4094] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" host="localhost" Mar 21 12:34:11.254835 containerd[1483]: 2025-03-21 12:34:11.201 [INFO][4094] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4 Mar 21 12:34:11.254835 containerd[1483]: 2025-03-21 12:34:11.211 [INFO][4094] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" host="localhost" Mar 21 12:34:11.254835 containerd[1483]: 2025-03-21 12:34:11.230 [INFO][4094] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" host="localhost" Mar 21 12:34:11.254835 containerd[1483]: 2025-03-21 12:34:11.230 [INFO][4094] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" host="localhost" Mar 21 12:34:11.254835 containerd[1483]: 2025-03-21 12:34:11.230 [INFO][4094] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
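The WorkloadEndpoint names in these Calico messages appear to follow a fixed pattern, <node>-k8s-<pod name with each '-' doubled>-<interface>, which is how csi-node-driver-x5fd2 becomes localhost-k8s-csi--node--driver--x5fd2-eth0. A sketch of that mapping, inferred only from the names logged here rather than from Calico's own code (the helper name is illustrative):

    def workload_endpoint_name(node: str, pod: str, iface: str = "eth0") -> str:
        """Build the endpoint-name pattern seen in these logs; dashes inside the
        pod name are doubled so the single-dash field separators stay unambiguous."""
        return f"{node}-k8s-{pod.replace('-', '--')}-{iface}"

    print(workload_endpoint_name("localhost", "csi-node-driver-x5fd2"))
    # localhost-k8s-csi--node--driver--x5fd2-eth0
    print(workload_endpoint_name("localhost", "calico-apiserver-8675c54584-755sn"))
    # localhost-k8s-calico--apiserver--8675c54584--755sn-eth0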
Mar 21 12:34:11.254835 containerd[1483]: 2025-03-21 12:34:11.230 [INFO][4094] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" HandleID="k8s-pod-network.8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Workload="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.254992 containerd[1483]: 2025-03-21 12:34:11.232 [INFO][4080] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--x5fd2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"819cfe7b-19b8-4e1c-86f0-166cf2f4d650", ResourceVersion:"602", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-x5fd2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6f09fdbee66", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:11.254992 containerd[1483]: 2025-03-21 12:34:11.232 [INFO][4080] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.255071 containerd[1483]: 2025-03-21 12:34:11.232 [INFO][4080] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f09fdbee66 ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.255071 containerd[1483]: 2025-03-21 12:34:11.235 [INFO][4080] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.255117 containerd[1483]: 2025-03-21 12:34:11.235 [INFO][4080] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--x5fd2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"819cfe7b-19b8-4e1c-86f0-166cf2f4d650", ResourceVersion:"602", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4", Pod:"csi-node-driver-x5fd2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6f09fdbee66", MAC:"a2:8f:51:82:a0:4e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:11.255163 containerd[1483]: 2025-03-21 12:34:11.252 [INFO][4080] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" Namespace="calico-system" Pod="csi-node-driver-x5fd2" WorkloadEndpoint="localhost-k8s-csi--node--driver--x5fd2-eth0" Mar 21 12:34:11.274838 containerd[1483]: time="2025-03-21T12:34:11.274780636Z" level=info msg="connecting to shim 8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4" address="unix:///run/containerd/s/68315548bbc3784cc77788cf7e1d04dbcabf79b2dfd5b28ab46a3b6c019df56d" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:34:11.300112 systemd[1]: Started cri-containerd-8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4.scope - libcontainer container 8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4. 
Mar 21 12:34:11.309317 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 21 12:34:11.325935 containerd[1483]: time="2025-03-21T12:34:11.325880695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x5fd2,Uid:819cfe7b-19b8-4e1c-86f0-166cf2f4d650,Namespace:calico-system,Attempt:0,} returns sandbox id \"8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4\"" Mar 21 12:34:12.135994 containerd[1483]: time="2025-03-21T12:34:12.135934462Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:12.137498 containerd[1483]: time="2025-03-21T12:34:12.137457182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 21 12:34:12.138878 containerd[1483]: time="2025-03-21T12:34:12.138854983Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:12.141628 containerd[1483]: time="2025-03-21T12:34:12.141556024Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:12.142260 containerd[1483]: time="2025-03-21T12:34:12.142147944Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 1.661836885s" Mar 21 12:34:12.142260 containerd[1483]: time="2025-03-21T12:34:12.142181224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 21 12:34:12.143429 containerd[1483]: time="2025-03-21T12:34:12.143405384Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 21 12:34:12.145362 containerd[1483]: time="2025-03-21T12:34:12.145299905Z" level=info msg="CreateContainer within sandbox \"6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 21 12:34:12.151850 containerd[1483]: time="2025-03-21T12:34:12.151258347Z" level=info msg="Container 303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:12.157386 containerd[1483]: time="2025-03-21T12:34:12.157284229Z" level=info msg="CreateContainer within sandbox \"6629eec29ea7f9553315e728dc761faf954606238b9e13701cabf5393edcccd0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79\"" Mar 21 12:34:12.158029 containerd[1483]: time="2025-03-21T12:34:12.157775669Z" level=info msg="StartContainer for \"303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79\"" Mar 21 12:34:12.158937 containerd[1483]: time="2025-03-21T12:34:12.158847710Z" level=info msg="connecting to shim 303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79" 
address="unix:///run/containerd/s/e216348a6d4dc2b253fbe24b4a33124a0bc9436d114c27b753fb5b771a6d088e" protocol=ttrpc version=3 Mar 21 12:34:12.179077 systemd[1]: Started cri-containerd-303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79.scope - libcontainer container 303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79. Mar 21 12:34:12.212231 containerd[1483]: time="2025-03-21T12:34:12.212186408Z" level=info msg="StartContainer for \"303ce0e6ec2a22b8a47587ffe4e3b082754cb8e734389ce0b081c3f8140d8e79\" returns successfully" Mar 21 12:34:12.274001 kubelet[2622]: I0321 12:34:12.273939 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8675c54584-755sn" podStartSLOduration=22.610352863 podStartE2EDuration="24.273901668s" podCreationTimestamp="2025-03-21 12:33:48 +0000 UTC" firstStartedPulling="2025-03-21 12:34:10.479787459 +0000 UTC m=+34.458458306" lastFinishedPulling="2025-03-21 12:34:12.143336264 +0000 UTC m=+36.122007111" observedRunningTime="2025-03-21 12:34:12.273128508 +0000 UTC m=+36.251799355" watchObservedRunningTime="2025-03-21 12:34:12.273901668 +0000 UTC m=+36.252572475" Mar 21 12:34:12.296337 systemd-networkd[1399]: calie8e191a170b: Gained IPv6LL Mar 21 12:34:13.103275 kubelet[2622]: I0321 12:34:13.103238 2622 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 12:34:13.104488 containerd[1483]: time="2025-03-21T12:34:13.104437865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-xs442,Uid:6da3189b-eb72-4225-933a-4a863afc15d4,Namespace:calico-apiserver,Attempt:0,}" Mar 21 12:34:13.104727 kubelet[2622]: E0321 12:34:13.104156 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:13.105019 containerd[1483]: time="2025-03-21T12:34:13.104990025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-czlm2,Uid:4b998b44-1450-4f89-99b4-102d22ff0e46,Namespace:kube-system,Attempt:0,}" Mar 21 12:34:13.105291 kubelet[2622]: E0321 12:34:13.105270 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:13.105978 kubelet[2622]: E0321 12:34:13.104464 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:13.109430 containerd[1483]: time="2025-03-21T12:34:13.105399786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbdffc7d-9rvwr,Uid:f07a8b17-5d2d-449c-b660-373aeac8779a,Namespace:calico-system,Attempt:0,}" Mar 21 12:34:13.109430 containerd[1483]: time="2025-03-21T12:34:13.106473546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zlht7,Uid:df7ef3b5-d48b-47d8-b992-f217ba61f745,Namespace:kube-system,Attempt:0,}" Mar 21 12:34:13.191130 systemd-networkd[1399]: cali6f09fdbee66: Gained IPv6LL Mar 21 12:34:13.276216 containerd[1483]: time="2025-03-21T12:34:13.276171359Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:13.277842 containerd[1483]: time="2025-03-21T12:34:13.277617160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active 
requests=0, bytes read=7473801" Mar 21 12:34:13.281043 containerd[1483]: time="2025-03-21T12:34:13.281014601Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:13.284487 containerd[1483]: time="2025-03-21T12:34:13.284431442Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:13.286842 containerd[1483]: time="2025-03-21T12:34:13.285523082Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.142083458s" Mar 21 12:34:13.286842 containerd[1483]: time="2025-03-21T12:34:13.285560322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 21 12:34:13.292754 containerd[1483]: time="2025-03-21T12:34:13.292693285Z" level=info msg="CreateContainer within sandbox \"8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 21 12:34:13.319556 containerd[1483]: time="2025-03-21T12:34:13.319517133Z" level=info msg="Container 91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:13.356928 containerd[1483]: time="2025-03-21T12:34:13.356788225Z" level=info msg="CreateContainer within sandbox \"8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a\"" Mar 21 12:34:13.359569 containerd[1483]: time="2025-03-21T12:34:13.358432385Z" level=info msg="StartContainer for \"91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a\"" Mar 21 12:34:13.360198 containerd[1483]: time="2025-03-21T12:34:13.360166306Z" level=info msg="connecting to shim 91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a" address="unix:///run/containerd/s/68315548bbc3784cc77788cf7e1d04dbcabf79b2dfd5b28ab46a3b6c019df56d" protocol=ttrpc version=3 Mar 21 12:34:13.393579 systemd-networkd[1399]: cali961b1a050a7: Link UP Mar 21 12:34:13.393770 systemd-networkd[1399]: cali961b1a050a7: Gained carrier Mar 21 12:34:13.434636 containerd[1483]: 2025-03-21 12:34:13.161 [INFO][4250] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 21 12:34:13.434636 containerd[1483]: 2025-03-21 12:34:13.194 [INFO][4250] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8675c54584--xs442-eth0 calico-apiserver-8675c54584- calico-apiserver 6da3189b-eb72-4225-933a-4a863afc15d4 689 0 2025-03-21 12:33:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8675c54584 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8675c54584-xs442 eth0 calico-apiserver [] [] 
[kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali961b1a050a7 [] []}} ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-" Mar 21 12:34:13.434636 containerd[1483]: 2025-03-21 12:34:13.195 [INFO][4250] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.434636 containerd[1483]: 2025-03-21 12:34:13.286 [INFO][4319] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" HandleID="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Workload="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.434522 systemd[1]: Started cri-containerd-91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a.scope - libcontainer container 91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a. Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.313 [INFO][4319] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" HandleID="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Workload="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c3a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8675c54584-xs442", "timestamp":"2025-03-21 12:34:13.286532163 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.318 [INFO][4319] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.318 [INFO][4319] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
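
A few entries up, the kubelet pod_startup_latency_tracker reports podStartE2EDuration=24.273901668s and podStartSLOduration=22.610352863s for calico-apiserver-8675c54584-755sn. The logged numbers are consistent with the SLO figure being the end-to-end startup time minus the window spent pulling images; the sketch below redoes that arithmetic from the timestamps in the same log entry.

package main

import (
	"fmt"
	"time"
)

// Timestamp layout matching the kubelet log fields, e.g.
// "2025-03-21 12:34:10.479787459 +0000 UTC".
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-03-21 12:33:48 +0000 UTC")             // podCreationTimestamp
	firstPull := mustParse("2025-03-21 12:34:10.479787459 +0000 UTC") // firstStartedPulling
	lastPull := mustParse("2025-03-21 12:34:12.143336264 +0000 UTC")  // lastFinishedPulling
	observed := mustParse("2025-03-21 12:34:12.273901668 +0000 UTC")  // watchObservedRunningTime

	e2e := observed.Sub(created)         // full end-to-end startup time
	slo := e2e - lastPull.Sub(firstPull) // same, minus the image-pull window

	fmt.Println("podStartE2EDuration:", e2e) // 24.273901668s, as logged
	fmt.Println("podStartSLOduration:", slo) // 22.610352863s, as logged
}

For the pods observed later in this section the pulling timestamps are the zero time (the images were already present), which is why their SLO and E2E durations come out identical.
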
Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.318 [INFO][4319] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.322 [INFO][4319] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" host="localhost" Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.343 [INFO][4319] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.350 [INFO][4319] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.352 [INFO][4319] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.355 [INFO][4319] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.435627 containerd[1483]: 2025-03-21 12:34:13.355 [INFO][4319] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" host="localhost" Mar 21 12:34:13.435836 containerd[1483]: 2025-03-21 12:34:13.359 [INFO][4319] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b Mar 21 12:34:13.435836 containerd[1483]: 2025-03-21 12:34:13.369 [INFO][4319] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" host="localhost" Mar 21 12:34:13.435836 containerd[1483]: 2025-03-21 12:34:13.378 [INFO][4319] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" host="localhost" Mar 21 12:34:13.435836 containerd[1483]: 2025-03-21 12:34:13.379 [INFO][4319] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" host="localhost" Mar 21 12:34:13.435836 containerd[1483]: 2025-03-21 12:34:13.379 [INFO][4319] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
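
The [4319] ipam lines above walk the full allocation sequence for calico-apiserver-8675c54584-xs442: take the host-wide IPAM lock, confirm this host's affinity for the 192.168.88.128/26 block, load the block, claim a free address, write the block back, release the lock. The Go below is a deliberately simplified sketch of that sequence, not Calico's actual ipam.go: the in-memory block, the 64-address walk and the placeholder handles are assumptions made for brevity (the real allocator also skips reserved addresses at the start of a block and persists everything to the datastore).

package main

import (
	"fmt"
	"net"
	"sync"
)

// Toy model of one IPAM block: a /26 plus a record of which handle claimed
// which address.
type block struct {
	cidr      *net.IPNet
	allocated map[string]string // IP string -> handle
}

var (
	hostLock sync.Mutex            // "About to acquire host-wide IPAM lock."
	blocks   = map[string]*block{} // blocks this host has an affinity for
)

func autoAssign(handle, blockCIDR string) (net.IP, error) {
	hostLock.Lock()         // "Acquired host-wide IPAM lock."
	defer hostLock.Unlock() // "Released host-wide IPAM lock."

	b, ok := blocks[blockCIDR] // "Trying affinity for 192.168.88.128/26"
	if !ok {
		_, cidr, err := net.ParseCIDR(blockCIDR)
		if err != nil {
			return nil, err
		}
		b = &block{cidr: cidr, allocated: map[string]string{}} // "Attempting to load block"
		blocks[blockCIDR] = b
	}

	// "Attempting to assign 1 addresses from block": walk the 64 addresses of
	// the /26 and claim the first one no handle owns yet.
	base := b.cidr.IP.To4()
	for i := 0; i < 64; i++ {
		candidate := net.IPv4(base[0], base[1], base[2], base[3]+byte(i))
		if _, taken := b.allocated[candidate.String()]; !taken {
			b.allocated[candidate.String()] = handle // "Writing block in order to claim IPs"
			return candidate, nil                    // "Successfully claimed IPs"
		}
	}
	return nil, fmt.Errorf("block %s is exhausted", blockCIDR)
}

func main() {
	// Handles in the log look like "k8s-pod-network.<sandbox container ID>";
	// the short strings here are placeholders.
	for _, h := range []string{"k8s-pod-network.aaa", "k8s-pod-network.bbb", "k8s-pod-network.ccc"} {
		ip, err := autoAssign(h, "192.168.88.128/26")
		if err != nil {
			panic(err)
		}
		fmt.Println(h, "->", ip)
	}
}
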
Mar 21 12:34:13.435836 containerd[1483]: 2025-03-21 12:34:13.382 [INFO][4319] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" HandleID="k8s-pod-network.ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Workload="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.436360 containerd[1483]: 2025-03-21 12:34:13.389 [INFO][4250] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8675c54584--xs442-eth0", GenerateName:"calico-apiserver-8675c54584-", Namespace:"calico-apiserver", SelfLink:"", UID:"6da3189b-eb72-4225-933a-4a863afc15d4", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8675c54584", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8675c54584-xs442", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali961b1a050a7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.436429 containerd[1483]: 2025-03-21 12:34:13.389 [INFO][4250] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.436429 containerd[1483]: 2025-03-21 12:34:13.389 [INFO][4250] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali961b1a050a7 ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.436429 containerd[1483]: 2025-03-21 12:34:13.391 [INFO][4250] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.436491 containerd[1483]: 2025-03-21 12:34:13.391 [INFO][4250] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8675c54584--xs442-eth0", GenerateName:"calico-apiserver-8675c54584-", Namespace:"calico-apiserver", SelfLink:"", UID:"6da3189b-eb72-4225-933a-4a863afc15d4", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8675c54584", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b", Pod:"calico-apiserver-8675c54584-xs442", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali961b1a050a7", MAC:"d6:cd:92:54:80:db", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.436855 containerd[1483]: 2025-03-21 12:34:13.410 [INFO][4250] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0" Mar 21 12:34:13.455369 containerd[1483]: time="2025-03-21T12:34:13.455297776Z" level=info msg="connecting to shim ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b" address="unix:///run/containerd/s/7a87f8cc6eff51a543286778248c9e80545908eaef27b3d38fc568e11475f788" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:34:13.473169 containerd[1483]: time="2025-03-21T12:34:13.473057901Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35\" id:\"ac81a68cdaa12e5f57b204b8dbacd0261fc9dbb65446b2289cc51d0782db84a0\" pid:4369 exit_status:1 exited_at:{seconds:1742560453 nanos:472390861}" Mar 21 12:34:13.491397 systemd[1]: Started cri-containerd-ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b.scope - libcontainer container ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b. 
Mar 21 12:34:13.513959 systemd-networkd[1399]: cali9e6c5addaa3: Link UP Mar 21 12:34:13.514112 systemd-networkd[1399]: cali9e6c5addaa3: Gained carrier Mar 21 12:34:13.520073 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 21 12:34:13.552155 containerd[1483]: time="2025-03-21T12:34:13.551999246Z" level=info msg="StartContainer for \"91177d09600b5f2b2346f46ea8983509b71a090048f3df9dd94124e0854f328a\" returns successfully" Mar 21 12:34:13.553311 containerd[1483]: time="2025-03-21T12:34:13.553264967Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 21 12:34:13.558621 containerd[1483]: 2025-03-21 12:34:13.209 [INFO][4275] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 21 12:34:13.558621 containerd[1483]: 2025-03-21 12:34:13.247 [INFO][4275] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0 calico-kube-controllers-5cbdffc7d- calico-system f07a8b17-5d2d-449c-b660-373aeac8779a 682 0 2025-03-21 12:33:49 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5cbdffc7d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5cbdffc7d-9rvwr eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9e6c5addaa3 [] []}} ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-" Mar 21 12:34:13.558621 containerd[1483]: 2025-03-21 12:34:13.248 [INFO][4275] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.558621 containerd[1483]: 2025-03-21 12:34:13.323 [INFO][4344] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" HandleID="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Workload="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.341 [INFO][4344] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" HandleID="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Workload="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001337e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5cbdffc7d-9rvwr", "timestamp":"2025-03-21 12:34:13.323658934 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.341 [INFO][4344] ipam/ipam_plugin.go 353: 
About to acquire host-wide IPAM lock. Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.379 [INFO][4344] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.379 [INFO][4344] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.424 [INFO][4344] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" host="localhost" Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.432 [INFO][4344] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.451 [INFO][4344] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.454 [INFO][4344] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.457 [INFO][4344] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.558811 containerd[1483]: 2025-03-21 12:34:13.457 [INFO][4344] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" host="localhost" Mar 21 12:34:13.560023 containerd[1483]: 2025-03-21 12:34:13.459 [INFO][4344] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed Mar 21 12:34:13.560023 containerd[1483]: 2025-03-21 12:34:13.479 [INFO][4344] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" host="localhost" Mar 21 12:34:13.560023 containerd[1483]: 2025-03-21 12:34:13.506 [INFO][4344] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" host="localhost" Mar 21 12:34:13.560023 containerd[1483]: 2025-03-21 12:34:13.506 [INFO][4344] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" host="localhost" Mar 21 12:34:13.560023 containerd[1483]: 2025-03-21 12:34:13.506 [INFO][4344] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 12:34:13.560023 containerd[1483]: 2025-03-21 12:34:13.506 [INFO][4344] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" HandleID="k8s-pod-network.c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Workload="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.560149 containerd[1483]: 2025-03-21 12:34:13.510 [INFO][4275] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0", GenerateName:"calico-kube-controllers-5cbdffc7d-", Namespace:"calico-system", SelfLink:"", UID:"f07a8b17-5d2d-449c-b660-373aeac8779a", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cbdffc7d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5cbdffc7d-9rvwr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9e6c5addaa3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.560204 containerd[1483]: 2025-03-21 12:34:13.511 [INFO][4275] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.560204 containerd[1483]: 2025-03-21 12:34:13.511 [INFO][4275] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e6c5addaa3 ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.560204 containerd[1483]: 2025-03-21 12:34:13.512 [INFO][4275] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.560269 containerd[1483]: 2025-03-21 12:34:13.513 [INFO][4275] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0", GenerateName:"calico-kube-controllers-5cbdffc7d-", Namespace:"calico-system", SelfLink:"", UID:"f07a8b17-5d2d-449c-b660-373aeac8779a", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cbdffc7d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed", Pod:"calico-kube-controllers-5cbdffc7d-9rvwr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9e6c5addaa3", MAC:"0e:9a:5b:ef:96:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.560316 containerd[1483]: 2025-03-21 12:34:13.553 [INFO][4275] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" Namespace="calico-system" Pod="calico-kube-controllers-5cbdffc7d-9rvwr" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5cbdffc7d--9rvwr-eth0" Mar 21 12:34:13.560316 containerd[1483]: time="2025-03-21T12:34:13.558996888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8675c54584-xs442,Uid:6da3189b-eb72-4225-933a-4a863afc15d4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b\"" Mar 21 12:34:13.565040 containerd[1483]: time="2025-03-21T12:34:13.563982130Z" level=info msg="CreateContainer within sandbox \"ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 21 12:34:13.578085 containerd[1483]: time="2025-03-21T12:34:13.577998974Z" level=info msg="Container 16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:13.590068 containerd[1483]: time="2025-03-21T12:34:13.590000378Z" level=info msg="CreateContainer within sandbox \"ebb8723a47930669698e73901f52de79369d410f1c2dbd550503452ec32bc06b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8\"" Mar 21 12:34:13.590730 containerd[1483]: time="2025-03-21T12:34:13.590697058Z" level=info msg="StartContainer for \"16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8\"" Mar 21 12:34:13.593502 containerd[1483]: 
time="2025-03-21T12:34:13.593360459Z" level=info msg="connecting to shim 16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8" address="unix:///run/containerd/s/7a87f8cc6eff51a543286778248c9e80545908eaef27b3d38fc568e11475f788" protocol=ttrpc version=3 Mar 21 12:34:13.601058 containerd[1483]: time="2025-03-21T12:34:13.600815302Z" level=info msg="connecting to shim c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed" address="unix:///run/containerd/s/c0c49b170506a53e551b487029491b8661ac0bbf09b75ace3b1f8fdac9d78d71" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:34:13.602339 systemd-networkd[1399]: calibd86d596542: Link UP Mar 21 12:34:13.602500 systemd-networkd[1399]: calibd86d596542: Gained carrier Mar 21 12:34:13.625132 containerd[1483]: 2025-03-21 12:34:13.208 [INFO][4263] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 21 12:34:13.625132 containerd[1483]: 2025-03-21 12:34:13.246 [INFO][4263] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--czlm2-eth0 coredns-668d6bf9bc- kube-system 4b998b44-1450-4f89-99b4-102d22ff0e46 688 0 2025-03-21 12:33:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-czlm2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibd86d596542 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-" Mar 21 12:34:13.625132 containerd[1483]: 2025-03-21 12:34:13.246 [INFO][4263] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.625132 containerd[1483]: 2025-03-21 12:34:13.325 [INFO][4337] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" HandleID="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Workload="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.345 [INFO][4337] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" HandleID="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Workload="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000304dc0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-czlm2", "timestamp":"2025-03-21 12:34:13.325549735 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.345 [INFO][4337] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.506 [INFO][4337] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.506 [INFO][4337] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.524 [INFO][4337] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" host="localhost" Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.556 [INFO][4337] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.562 [INFO][4337] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.566 [INFO][4337] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.572 [INFO][4337] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.627016 containerd[1483]: 2025-03-21 12:34:13.572 [INFO][4337] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" host="localhost" Mar 21 12:34:13.625443 systemd[1]: Started cri-containerd-16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8.scope - libcontainer container 16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8. Mar 21 12:34:13.627359 containerd[1483]: 2025-03-21 12:34:13.575 [INFO][4337] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639 Mar 21 12:34:13.627359 containerd[1483]: 2025-03-21 12:34:13.580 [INFO][4337] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" host="localhost" Mar 21 12:34:13.627359 containerd[1483]: 2025-03-21 12:34:13.590 [INFO][4337] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" host="localhost" Mar 21 12:34:13.627359 containerd[1483]: 2025-03-21 12:34:13.590 [INFO][4337] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" host="localhost" Mar 21 12:34:13.627359 containerd[1483]: 2025-03-21 12:34:13.590 [INFO][4337] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 12:34:13.627359 containerd[1483]: 2025-03-21 12:34:13.591 [INFO][4337] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" HandleID="k8s-pod-network.4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Workload="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.628386 containerd[1483]: 2025-03-21 12:34:13.596 [INFO][4263] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--czlm2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4b998b44-1450-4f89-99b4-102d22ff0e46", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-czlm2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibd86d596542", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.628457 containerd[1483]: 2025-03-21 12:34:13.597 [INFO][4263] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.628457 containerd[1483]: 2025-03-21 12:34:13.597 [INFO][4263] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibd86d596542 ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.628457 containerd[1483]: 2025-03-21 12:34:13.601 [INFO][4263] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.628516 containerd[1483]: 2025-03-21 12:34:13.602 
[INFO][4263] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--czlm2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4b998b44-1450-4f89-99b4-102d22ff0e46", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639", Pod:"coredns-668d6bf9bc-czlm2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibd86d596542", MAC:"2a:16:0a:7e:7b:81", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.628516 containerd[1483]: 2025-03-21 12:34:13.615 [INFO][4263] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0" Mar 21 12:34:13.637438 containerd[1483]: time="2025-03-21T12:34:13.637301313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c37e81752a3b0837b3c9f9566e8fcab42c39fd5c9545318153a7adf87ca0f35\" id:\"f78632ef033ef2e3783194b228f2889857b017eeb6af39d52f2ee85e30068e68\" pid:4477 exit_status:1 exited_at:{seconds:1742560453 nanos:636883953}" Mar 21 12:34:13.665408 systemd[1]: Started cri-containerd-c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed.scope - libcontainer container c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed. 
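
In the coredns WorkloadEndpoint dumps above, the named ports come out as v3.WorkloadEndpointPort values with hex port numbers (Port:0x35, Port:0x23c1) and a Protocol printed as numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, a union that can hold either a protocol number or a name. The sketch below decodes those hex values and mimics the string-or-number shape; the field names mirror the dump, but this is not the upstream numorstring package.

package main

import "fmt"

// String-or-number union in the spirit of the numorstring.Protocol values in
// the dump: Type selects which of NumVal/StrVal is meaningful.
type Protocol struct {
	Type   int // 1 with StrVal set, as seen in the dump (Type:1, StrVal:"UDP")
	NumVal uint8
	StrVal string
}

func (p Protocol) String() string {
	if p.Type == 1 {
		return p.StrVal
	}
	return fmt.Sprintf("%d", p.NumVal)
}

type WorkloadEndpointPort struct {
	Name     string
	Protocol Protocol
	Port     uint16
}

func main() {
	// The three coredns ports from the dump, with the hex literals decoded.
	ports := []WorkloadEndpointPort{
		{Name: "dns", Protocol: Protocol{Type: 1, StrVal: "UDP"}, Port: 0x35},       // 53
		{Name: "dns-tcp", Protocol: Protocol{Type: 1, StrVal: "TCP"}, Port: 0x35},   // 53
		{Name: "metrics", Protocol: Protocol{Type: 1, StrVal: "TCP"}, Port: 0x23c1}, // 9153
	}
	for _, p := range ports {
		fmt.Printf("%-8s %s/%d\n", p.Name, p.Protocol, p.Port)
	}
}
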
Mar 21 12:34:13.676828 containerd[1483]: time="2025-03-21T12:34:13.676776766Z" level=info msg="connecting to shim 4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639" address="unix:///run/containerd/s/55872094eede1d4e68296e7f1a6ac691545d0a0ea26fb77aee8b2f68b9a416de" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:34:13.694017 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 21 12:34:13.697304 systemd-networkd[1399]: cali74cf207c441: Link UP Mar 21 12:34:13.697471 systemd-networkd[1399]: cali74cf207c441: Gained carrier Mar 21 12:34:13.699628 containerd[1483]: time="2025-03-21T12:34:13.699142053Z" level=info msg="StartContainer for \"16f6437017cfd7a72f09dbc30d564242aaa8c0ae7a470e364e915d05267578c8\" returns successfully" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.233 [INFO][4291] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.261 [INFO][4291] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--zlht7-eth0 coredns-668d6bf9bc- kube-system df7ef3b5-d48b-47d8-b992-f217ba61f745 686 0 2025-03-21 12:33:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-zlht7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali74cf207c441 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.261 [INFO][4291] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.350 [INFO][4351] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" HandleID="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Workload="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.421 [INFO][4351] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" HandleID="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Workload="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000260690), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-zlht7", "timestamp":"2025-03-21 12:34:13.350256503 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.421 [INFO][4351] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
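
The TaskExit events in this section (both for container 7c37e817…) report exited_at as a protobuf-style {seconds nanos} pair. Converting the pair back to wall-clock time lines up with the timestamp on the containerd entry that reported it, e.g. for the second event:

package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at from the TaskExit entry above: {seconds:1742560453 nanos:636883953}.
	t := time.Unix(1742560453, 636883953).UTC()
	fmt.Println(t.Format(time.RFC3339Nano)) // 2025-03-21T12:34:13.636883953Z
}
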
Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.590 [INFO][4351] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.591 [INFO][4351] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.629 [INFO][4351] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.655 [INFO][4351] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.663 [INFO][4351] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.667 [INFO][4351] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.673 [INFO][4351] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.673 [INFO][4351] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.675 [INFO][4351] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3 Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.681 [INFO][4351] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.690 [INFO][4351] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.690 [INFO][4351] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" host="localhost" Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.690 [INFO][4351] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
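
The four concurrent CNI ADDs in this section ([4319], [4344], [4337], [4351]) each log "About to acquire host-wide IPAM lock" followed by "Acquired host-wide IPAM lock", and each "Acquired" coincides with the previous request's "Released" (13.379, 13.506, 13.590): the lock serializes pod network setup on the node. The short sketch below just turns the logged in-second offsets into per-request wait times (roughly 0 ms, 38 ms, 161 ms and 169 ms).

package main

import "fmt"

func main() {
	// Seconds-within-12:34:13 taken from the ipam_plugin.go 353/368 lines above.
	waits := []struct {
		req             string
		about, acquired float64
	}{
		{"[4319] apiserver-xs442", 13.318, 13.318},
		{"[4344] kube-controllers", 13.341, 13.379},
		{"[4337] coredns-czlm2", 13.345, 13.506},
		{"[4351] coredns-zlht7", 13.421, 13.590},
	}
	for _, w := range waits {
		fmt.Printf("%-25s waited %.0f ms for the host-wide IPAM lock\n",
			w.req, (w.acquired-w.about)*1000)
	}
}
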
Mar 21 12:34:13.714032 containerd[1483]: 2025-03-21 12:34:13.690 [INFO][4351] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" HandleID="k8s-pod-network.1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Workload="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.715032 containerd[1483]: 2025-03-21 12:34:13.695 [INFO][4291] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--zlht7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"df7ef3b5-d48b-47d8-b992-f217ba61f745", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-zlht7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali74cf207c441", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.715032 containerd[1483]: 2025-03-21 12:34:13.695 [INFO][4291] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.715032 containerd[1483]: 2025-03-21 12:34:13.695 [INFO][4291] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali74cf207c441 ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.715032 containerd[1483]: 2025-03-21 12:34:13.697 [INFO][4291] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.715032 containerd[1483]: 2025-03-21 12:34:13.697 
[INFO][4291] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--zlht7-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"df7ef3b5-d48b-47d8-b992-f217ba61f745", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 12, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3", Pod:"coredns-668d6bf9bc-zlht7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali74cf207c441", MAC:"d2:3d:4b:1c:5d:2d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 12:34:13.715032 containerd[1483]: 2025-03-21 12:34:13.709 [INFO][4291] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" Namespace="kube-system" Pod="coredns-668d6bf9bc-zlht7" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--zlht7-eth0" Mar 21 12:34:13.735097 systemd[1]: Started cri-containerd-4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639.scope - libcontainer container 4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639. 
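
By this point the section has logged five IPAM results out of 192.168.88.128/26 on this node: .130 for csi-node-driver-x5fd2, .131 for calico-apiserver-8675c54584-xs442, .132 for calico-kube-controllers-5cbdffc7d-9rvwr, .133 for coredns-668d6bf9bc-czlm2 and .134 for coredns-668d6bf9bc-zlht7. A small sketch that pulls namespace, pod and IP out of the "Calico CNI using IPs" lines, which is handy when skimming a dump like this one; the regular expression is an assumption tuned to exactly the lines shown here, not a general containerd log parser.

package main

import (
	"fmt"
	"regexp"
)

// Matches the "Calico CNI using IPs" lines emitted by cni-plugin/k8s.go 387
// in this section.
var usingIPs = regexp.MustCompile(`Calico CNI using IPs: \[([0-9./]+)\].*?Namespace="([^"]+)" Pod="([^"]+)"`)

func main() {
	// Shortened copies of two of the lines above (container IDs truncated).
	lines := []string{
		`cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="ebb8723a" Namespace="calico-apiserver" Pod="calico-apiserver-8675c54584-xs442" WorkloadEndpoint="localhost-k8s-calico--apiserver--8675c54584--xs442-eth0"`,
		`cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="4590148d" Namespace="kube-system" Pod="coredns-668d6bf9bc-czlm2" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--czlm2-eth0"`,
	}
	for _, l := range lines {
		if m := usingIPs.FindStringSubmatch(l); m != nil {
			fmt.Printf("%-55s %s\n", m[2]+"/"+m[3], m[1])
		}
	}
}
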
Mar 21 12:34:13.743202 containerd[1483]: time="2025-03-21T12:34:13.743159786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbdffc7d-9rvwr,Uid:f07a8b17-5d2d-449c-b660-373aeac8779a,Namespace:calico-system,Attempt:0,} returns sandbox id \"c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed\"" Mar 21 12:34:13.756163 containerd[1483]: time="2025-03-21T12:34:13.756114871Z" level=info msg="connecting to shim 1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3" address="unix:///run/containerd/s/a1089ebc1452d749540f85e985e2ab4c7da5972d5aab2aea87688a84b0e99c48" namespace=k8s.io protocol=ttrpc version=3 Mar 21 12:34:13.756970 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 21 12:34:13.788846 containerd[1483]: time="2025-03-21T12:34:13.788803681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-czlm2,Uid:4b998b44-1450-4f89-99b4-102d22ff0e46,Namespace:kube-system,Attempt:0,} returns sandbox id \"4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639\"" Mar 21 12:34:13.789148 systemd[1]: Started cri-containerd-1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3.scope - libcontainer container 1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3. Mar 21 12:34:13.790385 kubelet[2622]: E0321 12:34:13.790365 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:13.794348 containerd[1483]: time="2025-03-21T12:34:13.794272243Z" level=info msg="CreateContainer within sandbox \"4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 21 12:34:13.810146 systemd-resolved[1322]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 21 12:34:13.817624 containerd[1483]: time="2025-03-21T12:34:13.817578690Z" level=info msg="Container d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:13.826080 containerd[1483]: time="2025-03-21T12:34:13.825941853Z" level=info msg="CreateContainer within sandbox \"4590148dee4f495882e5213b42fb928b6137f6270de7b9067f0db3fa372e8639\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a\"" Mar 21 12:34:13.827037 containerd[1483]: time="2025-03-21T12:34:13.826968493Z" level=info msg="StartContainer for \"d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a\"" Mar 21 12:34:13.827948 containerd[1483]: time="2025-03-21T12:34:13.827894413Z" level=info msg="connecting to shim d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a" address="unix:///run/containerd/s/55872094eede1d4e68296e7f1a6ac691545d0a0ea26fb77aee8b2f68b9a416de" protocol=ttrpc version=3 Mar 21 12:34:13.844183 containerd[1483]: time="2025-03-21T12:34:13.844146098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zlht7,Uid:df7ef3b5-d48b-47d8-b992-f217ba61f745,Namespace:kube-system,Attempt:0,} returns sandbox id \"1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3\"" Mar 21 12:34:13.845664 kubelet[2622]: E0321 12:34:13.844957 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied 
nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:13.850200 containerd[1483]: time="2025-03-21T12:34:13.850168660Z" level=info msg="CreateContainer within sandbox \"1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 21 12:34:13.852698 systemd[1]: Started cri-containerd-d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a.scope - libcontainer container d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a. Mar 21 12:34:13.866165 containerd[1483]: time="2025-03-21T12:34:13.865977505Z" level=info msg="Container e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:13.878074 containerd[1483]: time="2025-03-21T12:34:13.877798469Z" level=info msg="CreateContainer within sandbox \"1a134800be61307b9567c80e4758d4171066ed760d3dd4c0a53ebe1f6fc8f8f3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c\"" Mar 21 12:34:13.878882 containerd[1483]: time="2025-03-21T12:34:13.878404309Z" level=info msg="StartContainer for \"e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c\"" Mar 21 12:34:13.880375 containerd[1483]: time="2025-03-21T12:34:13.880058750Z" level=info msg="connecting to shim e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c" address="unix:///run/containerd/s/a1089ebc1452d749540f85e985e2ab4c7da5972d5aab2aea87688a84b0e99c48" protocol=ttrpc version=3 Mar 21 12:34:13.902094 containerd[1483]: time="2025-03-21T12:34:13.902054437Z" level=info msg="StartContainer for \"d1f0e826305e5d0501274612cc45488eec5e7fc0fdf29c3e5d273e822c48e79a\" returns successfully" Mar 21 12:34:13.914748 systemd[1]: Started cri-containerd-e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c.scope - libcontainer container e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c. 
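The recurring kubelet dns.go:153 warnings in this stretch of the log come from kubelet capping the resolver list it writes into each pod's resolv.conf at three entries; the host configuration evidently lists more, so only 1.1.1.1, 1.0.0.1 and 8.8.8.8 are applied and the rest are dropped with this warning. A minimal sketch of that truncation, assuming /etc/resolv.conf as the input and the usual limit of three; this is an illustration, not kubelet's actual code.

package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// maxNameservers is the limit kubelet applies when building a pod's
// resolv.conf; three matches the servers kept in the log above.
const maxNameservers = 3

func main() {
	f, err := os.Open("/etc/resolv.conf") // assumed path, for illustration
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	defer f.Close()

	var nameservers []string
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		if len(fields) >= 2 && fields[0] == "nameserver" {
			nameservers = append(nameservers, fields[1])
		}
	}
	if len(nameservers) > maxNameservers {
		// The condition behind "Nameserver limits exceeded": extras are dropped.
		fmt.Printf("keeping %v, dropping %v\n",
			nameservers[:maxNameservers], nameservers[maxNameservers:])
		return
	}
	fmt.Println("nameservers:", nameservers)
}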
Mar 21 12:34:13.954282 containerd[1483]: time="2025-03-21T12:34:13.951675492Z" level=info msg="StartContainer for \"e98772f512e722bcb23b9719b9a6503c68afc3f7ff7b165b4ab160587c7e879c\" returns successfully" Mar 21 12:34:14.262831 kubelet[2622]: I0321 12:34:14.262364 2622 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 12:34:14.267036 kubelet[2622]: E0321 12:34:14.263421 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:14.275069 kubelet[2622]: E0321 12:34:14.275038 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:14.284349 kubelet[2622]: E0321 12:34:14.284308 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:14.289469 kubelet[2622]: I0321 12:34:14.287726 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-zlht7" podStartSLOduration=32.287711752 podStartE2EDuration="32.287711752s" podCreationTimestamp="2025-03-21 12:33:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:34:14.287380872 +0000 UTC m=+38.266051719" watchObservedRunningTime="2025-03-21 12:34:14.287711752 +0000 UTC m=+38.266382599" Mar 21 12:34:14.294440 kubelet[2622]: E0321 12:34:14.294406 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:14.307268 kubelet[2622]: I0321 12:34:14.307197 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-czlm2" podStartSLOduration=32.307182078 podStartE2EDuration="32.307182078s" podCreationTimestamp="2025-03-21 12:33:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:34:14.306782158 +0000 UTC m=+38.285453005" watchObservedRunningTime="2025-03-21 12:34:14.307182078 +0000 UTC m=+38.285852885" Mar 21 12:34:14.347582 kubelet[2622]: I0321 12:34:14.345939 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8675c54584-xs442" podStartSLOduration=26.34590313 podStartE2EDuration="26.34590313s" podCreationTimestamp="2025-03-21 12:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 12:34:14.341206248 +0000 UTC m=+38.319877095" watchObservedRunningTime="2025-03-21 12:34:14.34590313 +0000 UTC m=+38.324573977" Mar 21 12:34:14.599684 systemd-networkd[1399]: cali961b1a050a7: Gained IPv6LL Mar 21 12:34:14.701913 containerd[1483]: time="2025-03-21T12:34:14.701865835Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:14.703292 containerd[1483]: time="2025-03-21T12:34:14.703228995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 21 12:34:14.704181 containerd[1483]: 
time="2025-03-21T12:34:14.704153515Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:14.706462 containerd[1483]: time="2025-03-21T12:34:14.706241356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:14.707632 containerd[1483]: time="2025-03-21T12:34:14.707590436Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.154260069s" Mar 21 12:34:14.707632 containerd[1483]: time="2025-03-21T12:34:14.707627876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 21 12:34:14.710813 containerd[1483]: time="2025-03-21T12:34:14.710593957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 21 12:34:14.714201 containerd[1483]: time="2025-03-21T12:34:14.712739438Z" level=info msg="CreateContainer within sandbox \"8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 21 12:34:14.720207 containerd[1483]: time="2025-03-21T12:34:14.720177040Z" level=info msg="Container 5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:14.725298 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3130524428.mount: Deactivated successfully. Mar 21 12:34:14.735076 containerd[1483]: time="2025-03-21T12:34:14.735029245Z" level=info msg="CreateContainer within sandbox \"8dfdc740a4588fe200921fef8152287dcab9c96e8c1425db11fa85481342e2b4\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88\"" Mar 21 12:34:14.736869 containerd[1483]: time="2025-03-21T12:34:14.735566765Z" level=info msg="StartContainer for \"5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88\"" Mar 21 12:34:14.737483 containerd[1483]: time="2025-03-21T12:34:14.737452205Z" level=info msg="connecting to shim 5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88" address="unix:///run/containerd/s/68315548bbc3784cc77788cf7e1d04dbcabf79b2dfd5b28ab46a3b6c019df56d" protocol=ttrpc version=3 Mar 21 12:34:14.758076 systemd[1]: Started cri-containerd-5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88.scope - libcontainer container 5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88. 
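The node-driver-registrar pull above reports both the bytes fetched (bytes read=13121717) and the wall-clock pull time (1.154260069s); the separate size "14491426" is containerd's reported image size. Treating the bytes-read figure as the whole transfer, which is an assumption, gives a rough throughput of about 11 MiB/s:

package main

import "fmt"

func main() {
	// Figures copied from the containerd PullImage entries above.
	const bytesRead = 13121717.0 // "active requests=0, bytes read=13121717"
	const seconds = 1.154260069  // "... in 1.154260069s"
	mib := bytesRead / (1 << 20)
	fmt.Printf("~%.1f MiB in %.2fs -> ~%.1f MiB/s\n", mib, seconds, mib/seconds)
}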
Mar 21 12:34:14.792011 containerd[1483]: time="2025-03-21T12:34:14.791571461Z" level=info msg="StartContainer for \"5723d383aeae97b2f3821b71d65e3b92d28523893659c37a45401d02ff277a88\" returns successfully" Mar 21 12:34:15.047297 systemd-networkd[1399]: cali74cf207c441: Gained IPv6LL Mar 21 12:34:15.054091 kernel: bpftool[4858]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 21 12:34:15.123471 systemd[1]: Started sshd@11-10.0.0.87:22-10.0.0.1:48208.service - OpenSSH per-connection server daemon (10.0.0.1:48208). Mar 21 12:34:15.196666 kubelet[2622]: I0321 12:34:15.196605 2622 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 21 12:34:15.200968 kubelet[2622]: I0321 12:34:15.200931 2622 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 21 12:34:15.205906 sshd[4860]: Accepted publickey for core from 10.0.0.1 port 48208 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:15.211501 sshd-session[4860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:15.222204 systemd-logind[1467]: New session 12 of user core. Mar 21 12:34:15.227102 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 21 12:34:15.276075 systemd-networkd[1399]: vxlan.calico: Link UP Mar 21 12:34:15.276083 systemd-networkd[1399]: vxlan.calico: Gained carrier Mar 21 12:34:15.325492 kubelet[2622]: I0321 12:34:15.324109 2622 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 12:34:15.325693 kubelet[2622]: E0321 12:34:15.324239 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:15.325693 kubelet[2622]: E0321 12:34:15.324694 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:15.368773 systemd-networkd[1399]: cali9e6c5addaa3: Gained IPv6LL Mar 21 12:34:15.467621 sshd[4886]: Connection closed by 10.0.0.1 port 48208 Mar 21 12:34:15.468035 sshd-session[4860]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:15.479888 systemd[1]: sshd@11-10.0.0.87:22-10.0.0.1:48208.service: Deactivated successfully. Mar 21 12:34:15.483502 systemd[1]: session-12.scope: Deactivated successfully. Mar 21 12:34:15.485521 systemd-logind[1467]: Session 12 logged out. Waiting for processes to exit. Mar 21 12:34:15.487585 systemd[1]: Started sshd@12-10.0.0.87:22-10.0.0.1:48224.service - OpenSSH per-connection server daemon (10.0.0.1:48224). Mar 21 12:34:15.491444 systemd-logind[1467]: Removed session 12. Mar 21 12:34:15.546290 sshd[4949]: Accepted publickey for core from 10.0.0.1 port 48224 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:15.547598 sshd-session[4949]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:15.552713 systemd-logind[1467]: New session 13 of user core. Mar 21 12:34:15.559215 systemd[1]: Started session-13.scope - Session 13 of User core. 
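The systemd-networkd "Gained IPv6LL" messages for cali74cf207c441 and cali9e6c5addaa3 above (and for more cali* interfaces and vxlan.calico shortly after) mean those interfaces have finished bringing up their fe80::/10 link-local IPv6 address. A stdlib-only Go sketch that lists which local interfaces currently hold such an address; purely illustrative, unrelated to the components doing the logging.

package main

import (
	"fmt"
	"net"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		fmt.Println(err)
		return
	}
	for _, ifc := range ifaces {
		addrs, err := ifc.Addrs()
		if err != nil {
			continue
		}
		for _, a := range addrs {
			ipnet, ok := a.(*net.IPNet)
			if !ok {
				continue
			}
			// fe80::/10 link-local unicast is what "Gained IPv6LL" refers to.
			if ipnet.IP.To4() == nil && ipnet.IP.IsLinkLocalUnicast() {
				fmt.Printf("%s has IPv6LL %s\n", ifc.Name, ipnet.IP)
			}
		}
	}
}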
Mar 21 12:34:15.624768 systemd-networkd[1399]: calibd86d596542: Gained IPv6LL Mar 21 12:34:15.834703 sshd[4980]: Connection closed by 10.0.0.1 port 48224 Mar 21 12:34:15.835643 sshd-session[4949]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:15.849368 systemd[1]: sshd@12-10.0.0.87:22-10.0.0.1:48224.service: Deactivated successfully. Mar 21 12:34:15.852480 systemd[1]: session-13.scope: Deactivated successfully. Mar 21 12:34:15.856531 systemd-logind[1467]: Session 13 logged out. Waiting for processes to exit. Mar 21 12:34:15.858792 systemd[1]: Started sshd@13-10.0.0.87:22-10.0.0.1:48230.service - OpenSSH per-connection server daemon (10.0.0.1:48230). Mar 21 12:34:15.862987 systemd-logind[1467]: Removed session 13. Mar 21 12:34:15.928249 sshd[5012]: Accepted publickey for core from 10.0.0.1 port 48230 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:15.929746 sshd-session[5012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:15.934969 systemd-logind[1467]: New session 14 of user core. Mar 21 12:34:15.942175 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 21 12:34:16.187113 sshd[5015]: Connection closed by 10.0.0.1 port 48230 Mar 21 12:34:16.187584 sshd-session[5012]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:16.191166 systemd[1]: sshd@13-10.0.0.87:22-10.0.0.1:48230.service: Deactivated successfully. Mar 21 12:34:16.193542 systemd[1]: session-14.scope: Deactivated successfully. Mar 21 12:34:16.196074 systemd-logind[1467]: Session 14 logged out. Waiting for processes to exit. Mar 21 12:34:16.197831 systemd-logind[1467]: Removed session 14. Mar 21 12:34:16.257357 containerd[1483]: time="2025-03-21T12:34:16.256668586Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:16.257357 containerd[1483]: time="2025-03-21T12:34:16.257178307Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 21 12:34:16.257972 containerd[1483]: time="2025-03-21T12:34:16.257910507Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:16.260335 containerd[1483]: time="2025-03-21T12:34:16.260255427Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 12:34:16.261275 containerd[1483]: time="2025-03-21T12:34:16.260708107Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 1.54988679s" Mar 21 12:34:16.261275 containerd[1483]: time="2025-03-21T12:34:16.260740428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 21 12:34:16.270632 containerd[1483]: time="2025-03-21T12:34:16.270587070Z" level=info msg="CreateContainer within sandbox 
\"c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 21 12:34:16.275447 containerd[1483]: time="2025-03-21T12:34:16.275412831Z" level=info msg="Container e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd: CDI devices from CRI Config.CDIDevices: []" Mar 21 12:34:16.282074 containerd[1483]: time="2025-03-21T12:34:16.282030353Z" level=info msg="CreateContainer within sandbox \"c0ef004d057d3b8a7c419381ad4817bcaa092a7d946971c84df6b6d734e81aed\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd\"" Mar 21 12:34:16.282485 containerd[1483]: time="2025-03-21T12:34:16.282460993Z" level=info msg="StartContainer for \"e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd\"" Mar 21 12:34:16.283744 containerd[1483]: time="2025-03-21T12:34:16.283581873Z" level=info msg="connecting to shim e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd" address="unix:///run/containerd/s/c0c49b170506a53e551b487029491b8661ac0bbf09b75ace3b1f8fdac9d78d71" protocol=ttrpc version=3 Mar 21 12:34:16.304070 systemd[1]: Started cri-containerd-e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd.scope - libcontainer container e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd. Mar 21 12:34:16.328893 kubelet[2622]: E0321 12:34:16.328667 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:16.330057 kubelet[2622]: E0321 12:34:16.329805 2622 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 21 12:34:16.343494 containerd[1483]: time="2025-03-21T12:34:16.343244009Z" level=info msg="StartContainer for \"e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd\" returns successfully" Mar 21 12:34:16.455329 systemd-networkd[1399]: vxlan.calico: Gained IPv6LL Mar 21 12:34:17.357609 kubelet[2622]: I0321 12:34:17.357370 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-x5fd2" podStartSLOduration=24.975392524 podStartE2EDuration="28.357350666s" podCreationTimestamp="2025-03-21 12:33:49 +0000 UTC" firstStartedPulling="2025-03-21 12:34:11.327211015 +0000 UTC m=+35.305881862" lastFinishedPulling="2025-03-21 12:34:14.709169157 +0000 UTC m=+38.687840004" observedRunningTime="2025-03-21 12:34:15.346780579 +0000 UTC m=+39.325451426" watchObservedRunningTime="2025-03-21 12:34:17.357350666 +0000 UTC m=+41.336021513" Mar 21 12:34:17.358224 kubelet[2622]: I0321 12:34:17.357837 2622 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5cbdffc7d-9rvwr" podStartSLOduration=25.841969466 podStartE2EDuration="28.357830707s" podCreationTimestamp="2025-03-21 12:33:49 +0000 UTC" firstStartedPulling="2025-03-21 12:34:13.746032947 +0000 UTC m=+37.724703794" lastFinishedPulling="2025-03-21 12:34:16.261894228 +0000 UTC m=+40.240565035" observedRunningTime="2025-03-21 12:34:17.356798626 +0000 UTC m=+41.335469473" watchObservedRunningTime="2025-03-21 12:34:17.357830707 +0000 UTC m=+41.336501554" Mar 21 12:34:17.386166 containerd[1483]: time="2025-03-21T12:34:17.386126833Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"e6c96db6642a8bd48e7c85c039b4f192bf23a9c2fe9e87522abb5b029b28e1bd\" id:\"e6a5c0225e1b6d373516bbb0a82bb3eeba4089fc8c9f213f8c7ec197d2e958a1\" pid:5079 exited_at:{seconds:1742560457 nanos:385721033}" Mar 21 12:34:21.203352 systemd[1]: Started sshd@14-10.0.0.87:22-10.0.0.1:48242.service - OpenSSH per-connection server daemon (10.0.0.1:48242). Mar 21 12:34:21.265192 sshd[5100]: Accepted publickey for core from 10.0.0.1 port 48242 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:21.266564 sshd-session[5100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:21.273457 systemd-logind[1467]: New session 15 of user core. Mar 21 12:34:21.280059 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 21 12:34:21.514093 sshd[5102]: Connection closed by 10.0.0.1 port 48242 Mar 21 12:34:21.514440 sshd-session[5100]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:21.523145 systemd[1]: sshd@14-10.0.0.87:22-10.0.0.1:48242.service: Deactivated successfully. Mar 21 12:34:21.524834 systemd[1]: session-15.scope: Deactivated successfully. Mar 21 12:34:21.526028 systemd-logind[1467]: Session 15 logged out. Waiting for processes to exit. Mar 21 12:34:21.527262 systemd[1]: Started sshd@15-10.0.0.87:22-10.0.0.1:48256.service - OpenSSH per-connection server daemon (10.0.0.1:48256). Mar 21 12:34:21.527885 systemd-logind[1467]: Removed session 15. Mar 21 12:34:21.588375 sshd[5114]: Accepted publickey for core from 10.0.0.1 port 48256 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:21.589575 sshd-session[5114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:21.594401 systemd-logind[1467]: New session 16 of user core. Mar 21 12:34:21.605065 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 21 12:34:21.793265 sshd[5117]: Connection closed by 10.0.0.1 port 48256 Mar 21 12:34:21.794728 sshd-session[5114]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:21.805033 systemd[1]: sshd@15-10.0.0.87:22-10.0.0.1:48256.service: Deactivated successfully. Mar 21 12:34:21.806428 systemd[1]: session-16.scope: Deactivated successfully. Mar 21 12:34:21.807159 systemd-logind[1467]: Session 16 logged out. Waiting for processes to exit. Mar 21 12:34:21.808853 systemd[1]: Started sshd@16-10.0.0.87:22-10.0.0.1:48264.service - OpenSSH per-connection server daemon (10.0.0.1:48264). Mar 21 12:34:21.811570 systemd-logind[1467]: Removed session 16. Mar 21 12:34:21.860309 sshd[5128]: Accepted publickey for core from 10.0.0.1 port 48264 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:21.861497 sshd-session[5128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:21.865880 systemd-logind[1467]: New session 17 of user core. Mar 21 12:34:21.877051 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 21 12:34:22.596577 sshd[5131]: Connection closed by 10.0.0.1 port 48264 Mar 21 12:34:22.597163 sshd-session[5128]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:22.607416 systemd[1]: sshd@16-10.0.0.87:22-10.0.0.1:48264.service: Deactivated successfully. Mar 21 12:34:22.610615 systemd[1]: session-17.scope: Deactivated successfully. Mar 21 12:34:22.613443 systemd-logind[1467]: Session 17 logged out. Waiting for processes to exit. 
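The two kubelet pod_startup_latency_tracker entries above reconcile cleanly: for calico-system/csi-node-driver-x5fd2, watchObservedRunningTime (12:34:17.357350666) minus podCreationTimestamp (12:33:49) gives the logged podStartE2EDuration of 28.357350666s, and subtracting the image-pull window (firstStartedPulling 12:34:11.327211015 to lastFinishedPulling 12:34:14.709169157, about 3.382s) gives the podStartSLOduration of 24.975392524s. For the coredns pods earlier there was no pull, so the two durations coincide at about 32.29s. A short check of that arithmetic using the logged timestamps:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the csi-node-driver-x5fd2 entry above.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-03-21 12:33:49 +0000 UTC")
	watchObserved := parse("2025-03-21 12:34:17.357350666 +0000 UTC")
	pullStart := parse("2025-03-21 12:34:11.327211015 +0000 UTC")
	pullEnd := parse("2025-03-21 12:34:14.709169157 +0000 UTC")

	e2e := watchObserved.Sub(created)   // matches podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // E2E minus the image-pull window
	fmt.Println("E2E:", e2e, "SLO:", slo)
}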
Mar 21 12:34:22.615906 systemd[1]: Started sshd@17-10.0.0.87:22-10.0.0.1:37300.service - OpenSSH per-connection server daemon (10.0.0.1:37300). Mar 21 12:34:22.618502 systemd-logind[1467]: Removed session 17. Mar 21 12:34:22.670288 sshd[5149]: Accepted publickey for core from 10.0.0.1 port 37300 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:22.671437 sshd-session[5149]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:22.675633 systemd-logind[1467]: New session 18 of user core. Mar 21 12:34:22.689061 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 21 12:34:22.975708 sshd[5154]: Connection closed by 10.0.0.1 port 37300 Mar 21 12:34:22.975429 sshd-session[5149]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:22.983768 systemd[1]: sshd@17-10.0.0.87:22-10.0.0.1:37300.service: Deactivated successfully. Mar 21 12:34:22.987712 systemd[1]: session-18.scope: Deactivated successfully. Mar 21 12:34:22.989283 systemd-logind[1467]: Session 18 logged out. Waiting for processes to exit. Mar 21 12:34:22.991140 systemd[1]: Started sshd@18-10.0.0.87:22-10.0.0.1:37304.service - OpenSSH per-connection server daemon (10.0.0.1:37304). Mar 21 12:34:22.993817 systemd-logind[1467]: Removed session 18. Mar 21 12:34:23.038650 sshd[5165]: Accepted publickey for core from 10.0.0.1 port 37304 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:23.039842 sshd-session[5165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:23.044615 systemd-logind[1467]: New session 19 of user core. Mar 21 12:34:23.056091 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 21 12:34:23.212824 sshd[5168]: Connection closed by 10.0.0.1 port 37304 Mar 21 12:34:23.213176 sshd-session[5165]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:23.216550 systemd[1]: sshd@18-10.0.0.87:22-10.0.0.1:37304.service: Deactivated successfully. Mar 21 12:34:23.218393 systemd[1]: session-19.scope: Deactivated successfully. Mar 21 12:34:23.219816 systemd-logind[1467]: Session 19 logged out. Waiting for processes to exit. Mar 21 12:34:23.221133 systemd-logind[1467]: Removed session 19. Mar 21 12:34:28.224558 systemd[1]: Started sshd@19-10.0.0.87:22-10.0.0.1:37318.service - OpenSSH per-connection server daemon (10.0.0.1:37318). Mar 21 12:34:28.290128 sshd[5191]: Accepted publickey for core from 10.0.0.1 port 37318 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:28.291423 sshd-session[5191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:28.295250 systemd-logind[1467]: New session 20 of user core. Mar 21 12:34:28.305055 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 21 12:34:28.457113 sshd[5193]: Connection closed by 10.0.0.1 port 37318 Mar 21 12:34:28.457455 sshd-session[5191]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:28.460864 systemd[1]: sshd@19-10.0.0.87:22-10.0.0.1:37318.service: Deactivated successfully. Mar 21 12:34:28.463023 systemd[1]: session-20.scope: Deactivated successfully. Mar 21 12:34:28.463935 systemd-logind[1467]: Session 20 logged out. Waiting for processes to exit. Mar 21 12:34:28.464911 systemd-logind[1467]: Removed session 20. 
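Each "Accepted publickey" line in these SSH sessions shows the same key fingerprint, SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y, which is OpenSSH's standard naming for a key: the unpadded base64 of the SHA-256 digest of the binary public-key blob. The sketch below derives a fingerprint in that form from an authorized_keys-style line; the key material it uses is a dummy all-zero blob, not the key from this host.

package main

import (
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"strings"
)

// fingerprint returns the OpenSSH-style SHA256 fingerprint of the key in an
// authorized_keys line ("<type> <base64 blob> [comment]").
func fingerprint(line string) (string, error) {
	fields := strings.Fields(line)
	if len(fields) < 2 {
		return "", fmt.Errorf("not an authorized_keys line")
	}
	blob, err := base64.StdEncoding.DecodeString(fields[1])
	if err != nil {
		return "", err
	}
	sum := sha256.Sum256(blob)
	// OpenSSH prints SHA256 fingerprints as unpadded base64.
	return "SHA256:" + base64.RawStdEncoding.EncodeToString(sum[:]), nil
}

func main() {
	// Dummy 51-byte blob standing in for a real ed25519 public key.
	blob := make([]byte, 51)
	line := "ssh-ed25519 " + base64.StdEncoding.EncodeToString(blob) + " dummy@example"
	fp, err := fingerprint(line)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(fp)
}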
Mar 21 12:34:29.613764 kubelet[2622]: I0321 12:34:29.613720 2622 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 21 12:34:33.472953 systemd[1]: Started sshd@20-10.0.0.87:22-10.0.0.1:57840.service - OpenSSH per-connection server daemon (10.0.0.1:57840). Mar 21 12:34:33.516745 sshd[5209]: Accepted publickey for core from 10.0.0.1 port 57840 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:33.518791 sshd-session[5209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:33.523885 systemd-logind[1467]: New session 21 of user core. Mar 21 12:34:33.532152 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 21 12:34:33.719089 sshd[5211]: Connection closed by 10.0.0.1 port 57840 Mar 21 12:34:33.719634 sshd-session[5209]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:33.724336 systemd[1]: sshd@20-10.0.0.87:22-10.0.0.1:57840.service: Deactivated successfully. Mar 21 12:34:33.728052 systemd-logind[1467]: Session 21 logged out. Waiting for processes to exit. Mar 21 12:34:33.728190 systemd[1]: session-21.scope: Deactivated successfully. Mar 21 12:34:33.729311 systemd-logind[1467]: Removed session 21. Mar 21 12:34:38.734321 systemd[1]: Started sshd@21-10.0.0.87:22-10.0.0.1:57852.service - OpenSSH per-connection server daemon (10.0.0.1:57852). Mar 21 12:34:38.781634 sshd[5235]: Accepted publickey for core from 10.0.0.1 port 57852 ssh2: RSA SHA256:MdsOSlIGNpcftqwP7ll+xX3Rmkua/0DX/UznjsKKr2Y Mar 21 12:34:38.782650 sshd-session[5235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 12:34:38.787145 systemd-logind[1467]: New session 22 of user core. Mar 21 12:34:38.792084 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 21 12:34:38.923742 sshd[5237]: Connection closed by 10.0.0.1 port 57852 Mar 21 12:34:38.924141 sshd-session[5235]: pam_unix(sshd:session): session closed for user core Mar 21 12:34:38.927352 systemd[1]: sshd@21-10.0.0.87:22-10.0.0.1:57852.service: Deactivated successfully. Mar 21 12:34:38.929203 systemd[1]: session-22.scope: Deactivated successfully. Mar 21 12:34:38.929798 systemd-logind[1467]: Session 22 logged out. Waiting for processes to exit. Mar 21 12:34:38.930638 systemd-logind[1467]: Removed session 22.