Mar 20 18:03:45.877570 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 20 13:16:44 -00 2025
Mar 20 18:03:45.877591 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=30d38910dcb9abcb2ae1fb8c4b62196472dfae1a70f494441b86ff0de2ee88c9
Mar 20 18:03:45.877602 kernel: BIOS-provided physical RAM map:
Mar 20 18:03:45.877609 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 20 18:03:45.877624 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 20 18:03:45.877631 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 20 18:03:45.877639 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
Mar 20 18:03:45.877646 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
Mar 20 18:03:45.877659 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 20 18:03:45.877666 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 20 18:03:45.877682 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 20 18:03:45.877695 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 20 18:03:45.877708 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 20 18:03:45.877722 kernel: NX (Execute Disable) protection: active
Mar 20 18:03:45.877736 kernel: APIC: Static calls initialized
Mar 20 18:03:45.877746 kernel: SMBIOS 2.8 present.
Mar 20 18:03:45.877765 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
Mar 20 18:03:45.877774 kernel: Hypervisor detected: KVM
Mar 20 18:03:45.877781 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 20 18:03:45.877788 kernel: kvm-clock: using sched offset of 2334145352 cycles
Mar 20 18:03:45.877795 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 20 18:03:45.877803 kernel: tsc: Detected 2794.746 MHz processor
Mar 20 18:03:45.877811 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 20 18:03:45.877818 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 20 18:03:45.877825 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
Mar 20 18:03:45.877835 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 20 18:03:45.877843 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 20 18:03:45.877850 kernel: Using GB pages for direct mapping
Mar 20 18:03:45.877857 kernel: ACPI: Early table checksum verification disabled
Mar 20 18:03:45.877865 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
Mar 20 18:03:45.877872 kernel: ACPI: RSDT 0x000000009CFE2408 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877880 kernel: ACPI: FACP 0x000000009CFE21E8 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877887 kernel: ACPI: DSDT 0x000000009CFE0040 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877894 kernel: ACPI: FACS 0x000000009CFE0000 000040
Mar 20 18:03:45.877910 kernel: ACPI: APIC 0x000000009CFE22DC 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877940 kernel: ACPI: HPET 0x000000009CFE236C 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877947 kernel: ACPI: MCFG 0x000000009CFE23A4 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877955 kernel: ACPI: WAET 0x000000009CFE23E0 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 18:03:45.877962 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21e8-0x9cfe22db]
Mar 20 18:03:45.877970 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21e7]
Mar 20 18:03:45.877981 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
Mar 20 18:03:45.877990 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22dc-0x9cfe236b]
Mar 20 18:03:45.877998 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe236c-0x9cfe23a3]
Mar 20 18:03:45.878005 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23a4-0x9cfe23df]
Mar 20 18:03:45.878013 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23e0-0x9cfe2407]
Mar 20 18:03:45.878020 kernel: No NUMA configuration found
Mar 20 18:03:45.878028 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
Mar 20 18:03:45.878035 kernel: NODE_DATA(0) allocated [mem 0x9cfd6000-0x9cfdbfff]
Mar 20 18:03:45.878045 kernel: Zone ranges:
Mar 20 18:03:45.878053 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 20 18:03:45.878060 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
Mar 20 18:03:45.878068 kernel: Normal empty
Mar 20 18:03:45.878075 kernel: Movable zone start for each node
Mar 20 18:03:45.878082 kernel: Early memory node ranges
Mar 20 18:03:45.878090 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 20 18:03:45.878097 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
Mar 20 18:03:45.878105 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
Mar 20 18:03:45.878112 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 20 18:03:45.878122 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 20 18:03:45.878129 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Mar 20 18:03:45.878137 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 20 18:03:45.878144 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 20 18:03:45.878152 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 20 18:03:45.878159 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 20 18:03:45.878167 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 20 18:03:45.878174 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 20 18:03:45.878181 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 20 18:03:45.878191 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 20 18:03:45.878198 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 20 18:03:45.878206 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 20 18:03:45.878213 kernel: TSC deadline timer available
Mar 20 18:03:45.878221 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs
Mar 20 18:03:45.878228 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 20 18:03:45.878236 kernel: kvm-guest: KVM setup pv remote TLB flush
Mar 20 18:03:45.878243 kernel: kvm-guest: setup PV sched yield
Mar 20 18:03:45.878250 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 20 18:03:45.878260 kernel: Booting paravirtualized kernel on KVM
Mar 20 18:03:45.878268 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 20 18:03:45.878275 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Mar 20 18:03:45.878283 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u524288
Mar 20 18:03:45.878291 kernel: pcpu-alloc: s197032 r8192 d32344 u524288 alloc=1*2097152
Mar 20 18:03:45.878298 kernel: pcpu-alloc: [0] 0 1 2 3
Mar 20 18:03:45.878305 kernel: kvm-guest: PV spinlocks enabled
Mar 20 18:03:45.878313 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 20 18:03:45.878321 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=30d38910dcb9abcb2ae1fb8c4b62196472dfae1a70f494441b86ff0de2ee88c9
Mar 20 18:03:45.878332 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 20 18:03:45.878339 kernel: random: crng init done
Mar 20 18:03:45.878346 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 20 18:03:45.878354 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 20 18:03:45.878361 kernel: Fallback order for Node 0: 0
Mar 20 18:03:45.878369 kernel: Built 1 zonelists, mobility grouping on. Total pages: 632732
Mar 20 18:03:45.878376 kernel: Policy zone: DMA32
Mar 20 18:03:45.878384 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 20 18:03:45.878394 kernel: Memory: 2430496K/2571752K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 140996K reserved, 0K cma-reserved)
Mar 20 18:03:45.878401 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 20 18:03:45.878409 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 20 18:03:45.878416 kernel: ftrace: allocated 149 pages with 4 groups
Mar 20 18:03:45.878424 kernel: Dynamic Preempt: voluntary
Mar 20 18:03:45.878431 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 20 18:03:45.878439 kernel: rcu: RCU event tracing is enabled.
Mar 20 18:03:45.878447 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 20 18:03:45.878455 kernel: Trampoline variant of Tasks RCU enabled.
Mar 20 18:03:45.878464 kernel: Rude variant of Tasks RCU enabled.
Mar 20 18:03:45.878472 kernel: Tracing variant of Tasks RCU enabled.
Mar 20 18:03:45.878479 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 20 18:03:45.878487 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 20 18:03:45.878494 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Mar 20 18:03:45.878502 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 20 18:03:45.878509 kernel: Console: colour VGA+ 80x25
Mar 20 18:03:45.878517 kernel: printk: console [ttyS0] enabled
Mar 20 18:03:45.878524 kernel: ACPI: Core revision 20230628
Mar 20 18:03:45.878532 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 20 18:03:45.878542 kernel: APIC: Switch to symmetric I/O mode setup
Mar 20 18:03:45.878549 kernel: x2apic enabled
Mar 20 18:03:45.878556 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 20 18:03:45.878564 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Mar 20 18:03:45.878572 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Mar 20 18:03:45.878579 kernel: kvm-guest: setup PV IPIs
Mar 20 18:03:45.878596 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 20 18:03:45.878604 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 20 18:03:45.878612 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794746)
Mar 20 18:03:45.878619 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 20 18:03:45.878627 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 20 18:03:45.878637 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 20 18:03:45.878645 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 20 18:03:45.878652 kernel: Spectre V2 : Mitigation: Retpolines
Mar 20 18:03:45.878660 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 20 18:03:45.878668 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 20 18:03:45.878678 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Mar 20 18:03:45.878686 kernel: RETBleed: Mitigation: untrained return thunk
Mar 20 18:03:45.878694 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 20 18:03:45.878702 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 20 18:03:45.878709 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Mar 20 18:03:45.878718 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Mar 20 18:03:45.878726 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Mar 20 18:03:45.878734 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 20 18:03:45.878743 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 20 18:03:45.878751 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 20 18:03:45.878759 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 20 18:03:45.878767 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 20 18:03:45.878775 kernel: Freeing SMP alternatives memory: 32K
Mar 20 18:03:45.878783 kernel: pid_max: default: 32768 minimum: 301
Mar 20 18:03:45.878791 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 20 18:03:45.878798 kernel: landlock: Up and running.
Mar 20 18:03:45.878806 kernel: SELinux: Initializing.
Mar 20 18:03:45.878819 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 20 18:03:45.878832 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 20 18:03:45.878848 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Mar 20 18:03:45.878861 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 20 18:03:45.878874 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 20 18:03:45.878890 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 20 18:03:45.878902 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 20 18:03:45.878935 kernel: ... version: 0
Mar 20 18:03:45.878948 kernel: ... bit width: 48
Mar 20 18:03:45.878966 kernel: ... generic registers: 6
Mar 20 18:03:45.878979 kernel: ... value mask: 0000ffffffffffff
Mar 20 18:03:45.878992 kernel: ... max period: 00007fffffffffff
Mar 20 18:03:45.879000 kernel: ... fixed-purpose events: 0
Mar 20 18:03:45.879007 kernel: ... event mask: 000000000000003f
Mar 20 18:03:45.879017 kernel: signal: max sigframe size: 1776
Mar 20 18:03:45.879025 kernel: rcu: Hierarchical SRCU implementation.
Mar 20 18:03:45.879035 kernel: rcu: Max phase no-delay instances is 400.
Mar 20 18:03:45.879043 kernel: smp: Bringing up secondary CPUs ...
Mar 20 18:03:45.879053 kernel: smpboot: x86: Booting SMP configuration:
Mar 20 18:03:45.879061 kernel: .... node #0, CPUs: #1 #2 #3
Mar 20 18:03:45.879069 kernel: smp: Brought up 1 node, 4 CPUs
Mar 20 18:03:45.879076 kernel: smpboot: Max logical packages: 1
Mar 20 18:03:45.879084 kernel: smpboot: Total of 4 processors activated (22357.96 BogoMIPS)
Mar 20 18:03:45.879092 kernel: devtmpfs: initialized
Mar 20 18:03:45.879100 kernel: x86/mm: Memory block size: 128MB
Mar 20 18:03:45.879108 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 20 18:03:45.879115 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 20 18:03:45.879126 kernel: pinctrl core: initialized pinctrl subsystem
Mar 20 18:03:45.879134 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 20 18:03:45.879141 kernel: audit: initializing netlink subsys (disabled)
Mar 20 18:03:45.879149 kernel: audit: type=2000 audit(1742493825.874:1): state=initialized audit_enabled=0 res=1
Mar 20 18:03:45.879157 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 20 18:03:45.879165 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 20 18:03:45.879172 kernel: cpuidle: using governor menu
Mar 20 18:03:45.879180 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 20 18:03:45.879188 kernel: dca service started, version 1.12.1
Mar 20 18:03:45.879205 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 20 18:03:45.879214 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 20 18:03:45.879222 kernel: PCI: Using configuration type 1 for base access
Mar 20 18:03:45.879229 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 20 18:03:45.879244 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 20 18:03:45.879252 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 20 18:03:45.879260 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 20 18:03:45.879268 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 20 18:03:45.879276 kernel: ACPI: Added _OSI(Module Device)
Mar 20 18:03:45.879285 kernel: ACPI: Added _OSI(Processor Device)
Mar 20 18:03:45.879293 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 20 18:03:45.879301 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 20 18:03:45.879309 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 20 18:03:45.879317 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 20 18:03:45.879324 kernel: ACPI: Interpreter enabled
Mar 20 18:03:45.879332 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 20 18:03:45.879340 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 20 18:03:45.879348 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 20 18:03:45.879357 kernel: PCI: Using E820 reservations for host bridge windows
Mar 20 18:03:45.879365 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 20 18:03:45.879373 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 20 18:03:45.879548 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 20 18:03:45.879679 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 20 18:03:45.879802 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 20 18:03:45.879812 kernel: PCI host bridge to bus 0000:00
Mar 20 18:03:45.879968 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 20 18:03:45.880089 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 20 18:03:45.880202 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 20 18:03:45.880313 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
Mar 20 18:03:45.880466 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 20 18:03:45.880580 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Mar 20 18:03:45.880691 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 20 18:03:45.880833 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 20 18:03:45.880994 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000
Mar 20 18:03:45.881124 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfd000000-0xfdffffff pref]
Mar 20 18:03:45.881266 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfebd0000-0xfebd0fff]
Mar 20 18:03:45.881444 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfebc0000-0xfebcffff pref]
Mar 20 18:03:45.881569 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 20 18:03:45.881708 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00
Mar 20 18:03:45.881838 kernel: pci 0000:00:02.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 20 18:03:45.882024 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xfebd1000-0xfebd1fff]
Mar 20 18:03:45.882156 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfe000000-0xfe003fff 64bit pref]
Mar 20 18:03:45.882288 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000
Mar 20 18:03:45.882427 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc000-0xc07f]
Mar 20 18:03:45.882561 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebd2000-0xfebd2fff]
Mar 20 18:03:45.882691 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe004000-0xfe007fff 64bit pref]
Mar 20 18:03:45.882839 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
Mar 20 18:03:45.882999 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc0e0-0xc0ff]
Mar 20 18:03:45.883124 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebd3000-0xfebd3fff]
Mar 20 18:03:45.883246 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe008000-0xfe00bfff 64bit pref]
Mar 20 18:03:45.883367 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfeb80000-0xfebbffff pref]
Mar 20 18:03:45.883499 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 20 18:03:45.883627 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 20 18:03:45.883757 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 20 18:03:45.883880 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc100-0xc11f]
Mar 20 18:03:45.884055 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfebd4000-0xfebd4fff]
Mar 20 18:03:45.884188 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 20 18:03:45.884313 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 20 18:03:45.884323 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 20 18:03:45.884336 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 20 18:03:45.884344 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 20 18:03:45.884352 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 20 18:03:45.884360 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 20 18:03:45.884368 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 20 18:03:45.884375 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 20 18:03:45.884383 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 20 18:03:45.884391 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 20 18:03:45.884399 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 20 18:03:45.884409 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 20 18:03:45.884417 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 20 18:03:45.884425 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 20 18:03:45.884433 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 20 18:03:45.884440 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 20 18:03:45.884448 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 20 18:03:45.884456 kernel: iommu: Default domain type: Translated
Mar 20 18:03:45.884464 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 20 18:03:45.884472 kernel: PCI: Using ACPI for IRQ routing
Mar 20 18:03:45.884482 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 20 18:03:45.884490 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 20 18:03:45.884498 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
Mar 20 18:03:45.884622 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 20 18:03:45.884753 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 20 18:03:45.884881 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 20 18:03:45.884892 kernel: vgaarb: loaded
Mar 20 18:03:45.884900 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 20 18:03:45.884942 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 20 18:03:45.884950 kernel: clocksource: Switched to clocksource kvm-clock
Mar 20 18:03:45.884958 kernel: VFS: Disk quotas dquot_6.6.0
Mar 20 18:03:45.884966 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 20 18:03:45.884974 kernel: pnp: PnP ACPI init
Mar 20 18:03:45.885116 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 20 18:03:45.885128 kernel: pnp: PnP ACPI: found 6 devices
Mar 20 18:03:45.885137 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 20 18:03:45.885147 kernel: NET: Registered PF_INET protocol family
Mar 20 18:03:45.885155 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 20 18:03:45.885163 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 20 18:03:45.885171 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 20 18:03:45.885179 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 20 18:03:45.885187 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 20 18:03:45.885195 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 20 18:03:45.885203 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 20 18:03:45.885211 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 20 18:03:45.885221 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 20 18:03:45.885229 kernel: NET: Registered PF_XDP protocol family
Mar 20 18:03:45.885344 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 20 18:03:45.885457 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 20 18:03:45.885569 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 20 18:03:45.885681 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
Mar 20 18:03:45.885793 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 20 18:03:45.885963 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Mar 20 18:03:45.885979 kernel: PCI: CLS 0 bytes, default 64
Mar 20 18:03:45.885987 kernel: Initialise system trusted keyrings
Mar 20 18:03:45.885995 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 20 18:03:45.886003 kernel: Key type asymmetric registered
Mar 20 18:03:45.886011 kernel: Asymmetric key parser 'x509' registered
Mar 20 18:03:45.886019 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 20 18:03:45.886027 kernel: io scheduler mq-deadline registered
Mar 20 18:03:45.886035 kernel: io scheduler kyber registered
Mar 20 18:03:45.886042 kernel: io scheduler bfq registered
Mar 20 18:03:45.886050 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 20 18:03:45.886061 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 20 18:03:45.886069 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Mar 20 18:03:45.886077 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Mar 20 18:03:45.886085 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 20 18:03:45.886093 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 20 18:03:45.886100 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 20 18:03:45.886108 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 20 18:03:45.886116 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 20 18:03:45.886263 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 20 18:03:45.886384 kernel: rtc_cmos 00:04: registered as rtc0
Mar 20 18:03:45.886395 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Mar 20 18:03:45.886507 kernel: rtc_cmos 00:04: setting system clock to 2025-03-20T18:03:45 UTC (1742493825)
Mar 20 18:03:45.886620 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Mar 20 18:03:45.886631 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 20 18:03:45.886639 kernel: NET: Registered PF_INET6 protocol family
Mar 20 18:03:45.886646 kernel: Segment Routing with IPv6
Mar 20 18:03:45.886657 kernel: In-situ OAM (IOAM) with IPv6
Mar 20 18:03:45.886665 kernel: NET: Registered PF_PACKET protocol family
Mar 20 18:03:45.886673 kernel: Key type dns_resolver registered
Mar 20 18:03:45.886681 kernel: IPI shorthand broadcast: enabled
Mar 20 18:03:45.886689 kernel: sched_clock: Marking stable (530005226, 104341969)->(679460817, -45113622)
Mar 20 18:03:45.886697 kernel: registered taskstats version 1
Mar 20 18:03:45.886705 kernel: Loading compiled-in X.509 certificates
Mar 20 18:03:45.886713 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 2c0605e0441a1fddfb1f70673dce1f0d470be9b5'
Mar 20 18:03:45.886720 kernel: Key type .fscrypt registered
Mar 20 18:03:45.886728 kernel: Key type fscrypt-provisioning registered
Mar 20 18:03:45.886738 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 20 18:03:45.886746 kernel: ima: Allocated hash algorithm: sha1
Mar 20 18:03:45.886754 kernel: ima: No architecture policies found
Mar 20 18:03:45.886762 kernel: clk: Disabling unused clocks
Mar 20 18:03:45.886769 kernel: Freeing unused kernel image (initmem) memory: 43592K
Mar 20 18:03:45.886777 kernel: Write protecting the kernel read-only data: 40960k
Mar 20 18:03:45.886785 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 20 18:03:45.886793 kernel: Run /init as init process
Mar 20 18:03:45.886803 kernel: with arguments:
Mar 20 18:03:45.886810 kernel: /init
Mar 20 18:03:45.886818 kernel: with environment:
Mar 20 18:03:45.886826 kernel: HOME=/
Mar 20 18:03:45.886834 kernel: TERM=linux
Mar 20 18:03:45.886841 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 20 18:03:45.886850 systemd[1]: Successfully made /usr/ read-only.
Mar 20 18:03:45.886861 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 18:03:45.886873 systemd[1]: Detected virtualization kvm.
Mar 20 18:03:45.886881 systemd[1]: Detected architecture x86-64.
Mar 20 18:03:45.886889 systemd[1]: Running in initrd.
Mar 20 18:03:45.886898 systemd[1]: No hostname configured, using default hostname.
Mar 20 18:03:45.886925 systemd[1]: Hostname set to .
Mar 20 18:03:45.886934 systemd[1]: Initializing machine ID from VM UUID.
Mar 20 18:03:45.886942 systemd[1]: Queued start job for default target initrd.target.
Mar 20 18:03:45.886952 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 18:03:45.886963 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 18:03:45.886983 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 20 18:03:45.886994 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 18:03:45.887003 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 20 18:03:45.887012 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 20 18:03:45.887024 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 20 18:03:45.887033 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 20 18:03:45.887042 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 18:03:45.887050 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 18:03:45.887059 systemd[1]: Reached target paths.target - Path Units.
Mar 20 18:03:45.887068 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 18:03:45.887076 systemd[1]: Reached target swap.target - Swaps.
Mar 20 18:03:45.887085 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 18:03:45.887096 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 20 18:03:45.887104 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 20 18:03:45.887113 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 20 18:03:45.887122 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 20 18:03:45.887130 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 18:03:45.887139 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 18:03:45.887148 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 18:03:45.887156 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 18:03:45.887165 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 20 18:03:45.887176 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 18:03:45.887184 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 20 18:03:45.887193 systemd[1]: Starting systemd-fsck-usr.service...
Mar 20 18:03:45.887201 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 18:03:45.887210 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 18:03:45.887218 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 18:03:45.887227 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 20 18:03:45.887236 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 18:03:45.887247 systemd[1]: Finished systemd-fsck-usr.service.
Mar 20 18:03:45.887276 systemd-journald[193]: Collecting audit messages is disabled.
Mar 20 18:03:45.887300 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 20 18:03:45.887309 systemd-journald[193]: Journal started
Mar 20 18:03:45.887330 systemd-journald[193]: Runtime Journal (/run/log/journal/ea181759030a43b68cbf5e6e1c047280) is 6M, max 48.3M, 42.3M free.
Mar 20 18:03:45.879261 systemd-modules-load[194]: Inserted module 'overlay'
Mar 20 18:03:45.913508 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 20 18:03:45.913529 kernel: Bridge firewalling registered
Mar 20 18:03:45.906043 systemd-modules-load[194]: Inserted module 'br_netfilter'
Mar 20 18:03:45.919700 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 18:03:45.920210 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 20 18:03:45.922551 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 18:03:45.924990 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 18:03:45.931224 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 18:03:45.934445 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 18:03:45.942424 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 18:03:45.945424 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 18:03:45.951372 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 18:03:45.955023 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 18:03:45.957490 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 18:03:45.960195 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 18:03:45.962389 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 18:03:45.963200 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 20 18:03:45.985619 dracut-cmdline[230]: dracut-dracut-053
Mar 20 18:03:45.988414 dracut-cmdline[230]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=30d38910dcb9abcb2ae1fb8c4b62196472dfae1a70f494441b86ff0de2ee88c9
Mar 20 18:03:46.008716 systemd-resolved[228]: Positive Trust Anchors:
Mar 20 18:03:46.008733 systemd-resolved[228]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 18:03:46.008763 systemd-resolved[228]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 18:03:46.011271 systemd-resolved[228]: Defaulting to hostname 'linux'.
Mar 20 18:03:46.012333 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 18:03:46.018415 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 18:03:46.069952 kernel: SCSI subsystem initialized
Mar 20 18:03:46.079944 kernel: Loading iSCSI transport class v2.0-870.
Mar 20 18:03:46.091962 kernel: iscsi: registered transport (tcp)
Mar 20 18:03:46.112956 kernel: iscsi: registered transport (qla4xxx)
Mar 20 18:03:46.113013 kernel: QLogic iSCSI HBA Driver
Mar 20 18:03:46.161824 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 20 18:03:46.164437 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 20 18:03:46.203007 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 20 18:03:46.203080 kernel: device-mapper: uevent: version 1.0.3
Mar 20 18:03:46.204039 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 20 18:03:46.245945 kernel: raid6: avx2x4 gen() 30698 MB/s
Mar 20 18:03:46.262935 kernel: raid6: avx2x2 gen() 30117 MB/s
Mar 20 18:03:46.280202 kernel: raid6: avx2x1 gen() 20525 MB/s
Mar 20 18:03:46.280217 kernel: raid6: using algorithm avx2x4 gen() 30698 MB/s
Mar 20 18:03:46.298069 kernel: raid6: .... xor() 7174 MB/s, rmw enabled
Mar 20 18:03:46.298085 kernel: raid6: using avx2x2 recovery algorithm
Mar 20 18:03:46.318941 kernel: xor: automatically using best checksumming function avx
Mar 20 18:03:46.471950 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 20 18:03:46.485671 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 18:03:46.487864 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 18:03:46.518878 systemd-udevd[413]: Using default interface naming scheme 'v255'.
Mar 20 18:03:46.525180 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 18:03:46.527252 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 20 18:03:46.553199 dracut-pre-trigger[415]: rd.md=0: removing MD RAID activation
Mar 20 18:03:46.587003 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 20 18:03:46.588478 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 18:03:46.673148 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 18:03:46.675412 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 20 18:03:46.697309 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 20 18:03:46.700193 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 20 18:03:46.702954 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 18:03:46.705382 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 18:03:46.709036 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 20 18:03:46.710593 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Mar 20 18:03:46.727764 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Mar 20 18:03:46.727971 kernel: cryptd: max_cpu_qlen set to 1000
Mar 20 18:03:46.727984 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 20 18:03:46.728000 kernel: GPT:9289727 != 19775487
Mar 20 18:03:46.728011 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 20 18:03:46.728021 kernel: GPT:9289727 != 19775487
Mar 20 18:03:46.728031 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 20 18:03:46.728041 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 18:03:46.740042 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 20 18:03:46.745644 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 20 18:03:46.745667 kernel: libata version 3.00 loaded.
Mar 20 18:03:46.745678 kernel: AES CTR mode by8 optimization enabled
Mar 20 18:03:46.750945 kernel: ahci 0000:00:1f.2: version 3.0
Mar 20 18:03:46.784640 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 20 18:03:46.784657 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Mar 20 18:03:46.784823 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 20 18:03:46.785056 kernel: scsi host0: ahci
Mar 20 18:03:46.785211 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (469)
Mar 20 18:03:46.785223 kernel: BTRFS: device fsid 5af3bf9c-0d36-4793-88d6-028c3ca48c10 devid 1 transid 38 /dev/vda3 scanned by (udev-worker) (460)
Mar 20 18:03:46.785234 kernel: scsi host1: ahci
Mar 20 18:03:46.785381 kernel: scsi host2: ahci
Mar 20 18:03:46.785532 kernel: scsi host3: ahci
Mar 20 18:03:46.785676 kernel: scsi host4: ahci
Mar 20 18:03:46.785832 kernel: scsi host5: ahci
Mar 20 18:03:46.786767 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34
Mar 20 18:03:46.786784 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34
Mar 20 18:03:46.786795 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34
Mar 20 18:03:46.786807 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34
Mar 20 18:03:46.786821 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34
Mar 20 18:03:46.786843 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34
Mar 20 18:03:46.753065 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 18:03:46.753198 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 18:03:46.756264 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 18:03:46.757680 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 18:03:46.757810 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 18:03:46.761463 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 18:03:46.764221 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 18:03:46.789639 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 20 18:03:46.814531 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 20 18:03:46.841639 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 20 18:03:46.841964 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 18:03:46.849637 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 20 18:03:46.849725 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 20 18:03:46.851068 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 20 18:03:46.852068 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 18:03:46.872500 disk-uuid[554]: Primary Header is updated.
Mar 20 18:03:46.872500 disk-uuid[554]: Secondary Entries is updated.
Mar 20 18:03:46.872500 disk-uuid[554]: Secondary Header is updated.
Mar 20 18:03:46.875939 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 18:03:46.879253 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 18:03:46.882233 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 18:03:47.094130 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 20 18:03:47.094204 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Mar 20 18:03:47.094220 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 20 18:03:47.095947 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 20 18:03:47.096024 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 20 18:03:47.096943 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 20 18:03:47.097943 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 20 18:03:47.099353 kernel: ata3.00: applying bridge limits
Mar 20 18:03:47.099368 kernel: ata3.00: configured for UDMA/100
Mar 20 18:03:47.099936 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 20 18:03:47.154963 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 20 18:03:47.168586 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 20 18:03:47.168600 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Mar 20 18:03:47.882946 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 18:03:47.883065 disk-uuid[560]: The operation has completed successfully.
Mar 20 18:03:47.916468 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 20 18:03:47.916594 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 20 18:03:47.946701 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 20 18:03:47.966230 sh[590]: Success
Mar 20 18:03:47.978949 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
Mar 20 18:03:48.013635 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 20 18:03:48.015596 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 20 18:03:48.030854 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 20 18:03:48.038144 kernel: BTRFS info (device dm-0): first mount of filesystem 5af3bf9c-0d36-4793-88d6-028c3ca48c10
Mar 20 18:03:48.038174 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 20 18:03:48.038185 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 20 18:03:48.040520 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 20 18:03:48.040533 kernel: BTRFS info (device dm-0): using free space tree
Mar 20 18:03:48.044638 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 20 18:03:48.045237 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 20 18:03:48.046083 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 20 18:03:48.047958 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 20 18:03:48.072812 kernel: BTRFS info (device vda6): first mount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 18:03:48.072867 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 18:03:48.072879 kernel: BTRFS info (device vda6): using free space tree
Mar 20 18:03:48.075955 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 18:03:48.079934 kernel: BTRFS info (device vda6): last unmount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 18:03:48.086220 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 20 18:03:48.088387 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 20 18:03:48.160571 ignition[683]: Ignition 2.20.0
Mar 20 18:03:48.160585 ignition[683]: Stage: fetch-offline
Mar 20 18:03:48.160619 ignition[683]: no configs at "/usr/lib/ignition/base.d"
Mar 20 18:03:48.160629 ignition[683]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 20 18:03:48.163029 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 18:03:48.160728 ignition[683]: parsed url from cmdline: ""
Mar 20 18:03:48.166891 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 18:03:48.160732 ignition[683]: no config URL provided
Mar 20 18:03:48.160737 ignition[683]: reading system config file "/usr/lib/ignition/user.ign"
Mar 20 18:03:48.160747 ignition[683]: no config at "/usr/lib/ignition/user.ign"
Mar 20 18:03:48.160775 ignition[683]: op(1): [started] loading QEMU firmware config module
Mar 20 18:03:48.160780 ignition[683]: op(1): executing: "modprobe" "qemu_fw_cfg"
Mar 20 18:03:48.169487 ignition[683]: op(1): [finished] loading QEMU firmware config module
Mar 20 18:03:48.169509 ignition[683]: QEMU firmware config was not found. Ignoring...
Mar 20 18:03:48.206760 systemd-networkd[777]: lo: Link UP
Mar 20 18:03:48.206770 systemd-networkd[777]: lo: Gained carrier
Mar 20 18:03:48.208680 systemd-networkd[777]: Enumeration completed
Mar 20 18:03:48.208775 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 18:03:48.209081 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 18:03:48.209086 systemd-networkd[777]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 20 18:03:48.209886 systemd-networkd[777]: eth0: Link UP
Mar 20 18:03:48.209889 systemd-networkd[777]: eth0: Gained carrier
Mar 20 18:03:48.209895 systemd-networkd[777]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 18:03:48.210414 systemd[1]: Reached target network.target - Network.
Mar 20 18:03:48.227381 ignition[683]: parsing config with SHA512: fb54bfeff020f329477e2c96137f74e78b9b37a8edb15cbd8452f4c1387c0840a65118b043b3cb1c8224006275d8fb75f98790ec3ffec87b3b18896105c47708
Mar 20 18:03:48.229959 systemd-networkd[777]: eth0: DHCPv4 address 10.0.0.124/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 20 18:03:48.233385 unknown[683]: fetched base config from "system"
Mar 20 18:03:48.233520 unknown[683]: fetched user config from "qemu"
Mar 20 18:03:48.233958 ignition[683]: fetch-offline: fetch-offline passed
Mar 20 18:03:48.234036 ignition[683]: Ignition finished successfully
Mar 20 18:03:48.238609 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 20 18:03:48.238847 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Mar 20 18:03:48.240976 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 20 18:03:48.272212 ignition[782]: Ignition 2.20.0
Mar 20 18:03:48.272228 ignition[782]: Stage: kargs
Mar 20 18:03:48.272416 ignition[782]: no configs at "/usr/lib/ignition/base.d"
Mar 20 18:03:48.272432 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 20 18:03:48.273527 ignition[782]: kargs: kargs passed
Mar 20 18:03:48.273580 ignition[782]: Ignition finished successfully
Mar 20 18:03:48.277669 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 20 18:03:48.279703 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 20 18:03:48.309691 ignition[790]: Ignition 2.20.0
Mar 20 18:03:48.309702 ignition[790]: Stage: disks
Mar 20 18:03:48.309872 ignition[790]: no configs at "/usr/lib/ignition/base.d"
Mar 20 18:03:48.309883 ignition[790]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 20 18:03:48.310674 ignition[790]: disks: disks passed
Mar 20 18:03:48.313187 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 20 18:03:48.310719 ignition[790]: Ignition finished successfully
Mar 20 18:03:48.314404 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 20 18:03:48.315953 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 20 18:03:48.318069 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 18:03:48.319089 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 18:03:48.320809 systemd[1]: Reached target basic.target - Basic System.
Mar 20 18:03:48.322651 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 20 18:03:48.346675 systemd-fsck[801]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Mar 20 18:03:48.352940 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 20 18:03:48.356900 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 20 18:03:48.450936 kernel: EXT4-fs (vda9): mounted filesystem bf9c440e-9fee-4e54-8539-b83f5a9eea2f r/w with ordered data mode. Quota mode: none.
Mar 20 18:03:48.451900 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 20 18:03:48.452590 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 20 18:03:48.454033 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 18:03:48.456310 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 20 18:03:48.457665 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 20 18:03:48.457706 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 20 18:03:48.457729 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 20 18:03:48.469997 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 20 18:03:48.473375 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 20 18:03:48.478051 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (809)
Mar 20 18:03:48.478071 kernel: BTRFS info (device vda6): first mount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 18:03:48.478082 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 18:03:48.478092 kernel: BTRFS info (device vda6): using free space tree
Mar 20 18:03:48.480931 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 18:03:48.482886 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 18:03:48.508636 initrd-setup-root[833]: cut: /sysroot/etc/passwd: No such file or directory
Mar 20 18:03:48.513226 initrd-setup-root[840]: cut: /sysroot/etc/group: No such file or directory
Mar 20 18:03:48.517465 initrd-setup-root[847]: cut: /sysroot/etc/shadow: No such file or directory
Mar 20 18:03:48.521128 initrd-setup-root[854]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 20 18:03:48.601207 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 20 18:03:48.603719 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 20 18:03:48.605327 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 20 18:03:48.620935 kernel: BTRFS info (device vda6): last unmount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 18:03:48.642102 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 20 18:03:48.653885 ignition[923]: INFO : Ignition 2.20.0
Mar 20 18:03:48.653885 ignition[923]: INFO : Stage: mount
Mar 20 18:03:48.655569 ignition[923]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 18:03:48.655569 ignition[923]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 20 18:03:48.655569 ignition[923]: INFO : mount: mount passed
Mar 20 18:03:48.655569 ignition[923]: INFO : Ignition finished successfully
Mar 20 18:03:48.657299 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 20 18:03:48.658689 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 20 18:03:49.037972 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 20 18:03:49.039815 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 18:03:49.066928 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (936)
Mar 20 18:03:49.066974 kernel: BTRFS info (device vda6): first mount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 18:03:49.066993 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 18:03:49.068931 kernel: BTRFS info (device vda6): using free space tree
Mar 20 18:03:49.070942 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 18:03:49.072999 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 18:03:49.103680 ignition[953]: INFO : Ignition 2.20.0 Mar 20 18:03:49.103680 ignition[953]: INFO : Stage: files Mar 20 18:03:49.105543 ignition[953]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 20 18:03:49.105543 ignition[953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 20 18:03:49.105543 ignition[953]: DEBUG : files: compiled without relabeling support, skipping Mar 20 18:03:49.109398 ignition[953]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 20 18:03:49.109398 ignition[953]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 20 18:03:49.112809 ignition[953]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 20 18:03:49.112809 ignition[953]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 20 18:03:49.115925 ignition[953]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 20 18:03:49.115925 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 20 18:03:49.115925 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 20 18:03:49.113058 unknown[953]: wrote ssh authorized keys file for user: core Mar 20 18:03:49.157329 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 20 18:03:49.372860 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 20 18:03:49.372860 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 20 18:03:49.376871 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 18:03:49.378778 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 20 18:03:49.855692 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 20 18:03:49.975234 systemd-networkd[777]: eth0: Gained IPv6LL Mar 20 18:03:50.194025 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 20 18:03:50.194025 ignition[953]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Mar 20 18:03:50.198035 ignition[953]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Mar 20 18:03:50.216974 ignition[953]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Mar 20 18:03:50.220801 ignition[953]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Mar 20 18:03:50.222356 ignition[953]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Mar 20 18:03:50.222356 ignition[953]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Mar 20 18:03:50.222356 ignition[953]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Mar 20 18:03:50.222356 ignition[953]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 20 18:03:50.222356 ignition[953]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 20 18:03:50.222356 ignition[953]: INFO : files: files passed Mar 20 18:03:50.222356 ignition[953]: INFO : Ignition finished successfully Mar 20 18:03:50.233387 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 20 18:03:50.236028 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 20 18:03:50.238386 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
Mar 20 18:03:50.250328 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 20 18:03:50.250461 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 20 18:03:50.254951 initrd-setup-root-after-ignition[982]: grep: /sysroot/oem/oem-release: No such file or directory Mar 20 18:03:50.259240 initrd-setup-root-after-ignition[985]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 20 18:03:50.259240 initrd-setup-root-after-ignition[985]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 20 18:03:50.262492 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 20 18:03:50.266230 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 20 18:03:50.267726 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 20 18:03:50.271000 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 20 18:03:50.327445 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 20 18:03:50.327577 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 20 18:03:50.329964 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 20 18:03:50.331044 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 20 18:03:50.331404 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 20 18:03:50.334796 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 20 18:03:50.369727 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 20 18:03:50.371290 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 20 18:03:50.397540 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 20 18:03:50.397719 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 20 18:03:50.401071 systemd[1]: Stopped target timers.target - Timer Units. Mar 20 18:03:50.402197 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 20 18:03:50.402327 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 20 18:03:50.405458 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 20 18:03:50.407016 systemd[1]: Stopped target basic.target - Basic System. Mar 20 18:03:50.409198 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 20 18:03:50.411302 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 20 18:03:50.413373 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 20 18:03:50.415524 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 20 18:03:50.417635 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 20 18:03:50.419955 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 20 18:03:50.421882 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 20 18:03:50.424144 systemd[1]: Stopped target swap.target - Swaps. Mar 20 18:03:50.426020 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 20 18:03:50.426140 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 20 18:03:50.428557 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. 
Mar 20 18:03:50.430041 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 20 18:03:50.432134 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 20 18:03:50.432228 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 20 18:03:50.434504 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 20 18:03:50.434636 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 20 18:03:50.437057 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 20 18:03:50.437172 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 20 18:03:50.439103 systemd[1]: Stopped target paths.target - Path Units. Mar 20 18:03:50.440944 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 20 18:03:50.444979 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 20 18:03:50.446732 systemd[1]: Stopped target slices.target - Slice Units. Mar 20 18:03:50.448712 systemd[1]: Stopped target sockets.target - Socket Units. Mar 20 18:03:50.450485 systemd[1]: iscsid.socket: Deactivated successfully. Mar 20 18:03:50.450586 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 20 18:03:50.452513 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 20 18:03:50.452597 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 20 18:03:50.455046 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 20 18:03:50.455188 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 20 18:03:50.457115 systemd[1]: ignition-files.service: Deactivated successfully. Mar 20 18:03:50.457264 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 20 18:03:50.459903 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 20 18:03:50.461590 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 20 18:03:50.461743 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 20 18:03:50.479598 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 20 18:03:50.480566 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 20 18:03:50.480733 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 20 18:03:50.483233 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 20 18:03:50.483380 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 20 18:03:50.490787 ignition[1009]: INFO : Ignition 2.20.0 Mar 20 18:03:50.490787 ignition[1009]: INFO : Stage: umount Mar 20 18:03:50.493015 ignition[1009]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 20 18:03:50.493015 ignition[1009]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 20 18:03:50.493015 ignition[1009]: INFO : umount: umount passed Mar 20 18:03:50.493015 ignition[1009]: INFO : Ignition finished successfully Mar 20 18:03:50.493594 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 20 18:03:50.493711 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 20 18:03:50.495718 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 20 18:03:50.495827 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 20 18:03:50.500175 systemd[1]: Stopped target network.target - Network. 
Mar 20 18:03:50.501562 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 20 18:03:50.501623 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 20 18:03:50.501730 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 20 18:03:50.501776 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 20 18:03:50.502402 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 20 18:03:50.502447 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 20 18:03:50.502618 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 20 18:03:50.502660 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 20 18:03:50.502979 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 20 18:03:50.503137 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 20 18:03:50.512595 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 20 18:03:50.512722 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 20 18:03:50.517670 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 20 18:03:50.517827 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 20 18:03:50.518509 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 20 18:03:50.518563 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 20 18:03:50.523339 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 20 18:03:50.523668 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 20 18:03:50.523864 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 20 18:03:50.527380 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 20 18:03:50.528052 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 20 18:03:50.528120 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 20 18:03:50.531152 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 20 18:03:50.532531 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 20 18:03:50.532597 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 20 18:03:50.535488 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 20 18:03:50.535542 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 20 18:03:50.538123 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 20 18:03:50.538176 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 20 18:03:50.540496 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 20 18:03:50.543864 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 20 18:03:50.559262 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 20 18:03:50.559428 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 20 18:03:50.564744 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 20 18:03:50.564988 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 20 18:03:50.567516 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 20 18:03:50.567582 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Mar 20 18:03:50.569830 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 20 18:03:50.569883 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 20 18:03:50.571768 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 20 18:03:50.571841 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 20 18:03:50.573865 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 20 18:03:50.573946 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 20 18:03:50.575808 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 20 18:03:50.575873 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 20 18:03:50.579200 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 20 18:03:50.580325 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 20 18:03:50.580391 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 20 18:03:50.582656 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 20 18:03:50.582722 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 18:03:50.586074 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 20 18:03:50.586154 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 20 18:03:50.598006 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 20 18:03:50.598151 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 20 18:03:50.666524 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 20 18:03:50.666666 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 20 18:03:50.668709 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 20 18:03:50.670351 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 20 18:03:50.670406 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 20 18:03:50.673234 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 20 18:03:50.692538 systemd[1]: Switching root. Mar 20 18:03:50.723026 systemd-journald[193]: Journal stopped Mar 20 18:03:51.877605 systemd-journald[193]: Received SIGTERM from PID 1 (systemd). Mar 20 18:03:51.877669 kernel: SELinux: policy capability network_peer_controls=1 Mar 20 18:03:51.877690 kernel: SELinux: policy capability open_perms=1 Mar 20 18:03:51.877702 kernel: SELinux: policy capability extended_socket_class=1 Mar 20 18:03:51.877713 kernel: SELinux: policy capability always_check_network=0 Mar 20 18:03:51.877729 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 20 18:03:51.877741 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 20 18:03:51.877752 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 20 18:03:51.877763 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 20 18:03:51.877783 kernel: audit: type=1403 audit(1742493831.069:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 20 18:03:51.877801 systemd[1]: Successfully loaded SELinux policy in 41.051ms. Mar 20 18:03:51.877829 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 13.398ms. 
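
After the switch root, journald restarts and the kernel logs the capabilities of the freshly loaded SELinux policy (loaded in about 41 ms). A small sketch, assuming selinuxfs is mounted at its usual /sys/fs/selinux location, for asking the running kernel whether that policy is being enforced:

    from pathlib import Path

    SELINUXFS = Path("/sys/fs/selinux")   # assumed mount point (the usual default)

    def selinux_status() -> str:
        if not SELINUXFS.is_dir():
            return "selinuxfs not mounted (SELinux unavailable)"
        enforce = (SELINUXFS / "enforce").read_text().strip()
        policyvers = (SELINUXFS / "policyvers").read_text().strip()
        mode = "enforcing" if enforce == "1" else "permissive"
        return f"SELinux {mode}, max policy version {policyvers}"

    print(selinux_status())
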
Mar 20 18:03:51.877844 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 20 18:03:51.877856 systemd[1]: Detected virtualization kvm. Mar 20 18:03:51.877868 systemd[1]: Detected architecture x86-64. Mar 20 18:03:51.877881 systemd[1]: Detected first boot. Mar 20 18:03:51.877893 systemd[1]: Initializing machine ID from VM UUID. Mar 20 18:03:51.877905 zram_generator::config[1056]: No configuration found. Mar 20 18:03:51.877931 kernel: Guest personality initialized and is inactive Mar 20 18:03:51.877946 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Mar 20 18:03:51.877957 kernel: Initialized host personality Mar 20 18:03:51.877969 kernel: NET: Registered PF_VSOCK protocol family Mar 20 18:03:51.877980 systemd[1]: Populated /etc with preset unit settings. Mar 20 18:03:51.877993 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 20 18:03:51.878007 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 20 18:03:51.878019 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 20 18:03:51.878032 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 20 18:03:51.878044 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 20 18:03:51.878059 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 20 18:03:51.878071 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 20 18:03:51.878083 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 20 18:03:51.878096 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 20 18:03:51.878108 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 20 18:03:51.878122 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 20 18:03:51.878134 systemd[1]: Created slice user.slice - User and Session Slice. Mar 20 18:03:51.878147 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 20 18:03:51.878159 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 20 18:03:51.878174 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 20 18:03:51.878186 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 20 18:03:51.878199 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 20 18:03:51.878212 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 20 18:03:51.878224 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 20 18:03:51.878236 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 20 18:03:51.878248 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 20 18:03:51.878263 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 20 18:03:51.878275 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. 
Mar 20 18:03:51.878287 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 20 18:03:51.878299 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 20 18:03:51.878312 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 20 18:03:51.878324 systemd[1]: Reached target slices.target - Slice Units. Mar 20 18:03:51.878336 systemd[1]: Reached target swap.target - Swaps. Mar 20 18:03:51.878348 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 20 18:03:51.878360 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 20 18:03:51.878372 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 20 18:03:51.878388 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 20 18:03:51.878400 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 20 18:03:51.878413 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 20 18:03:51.878425 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 20 18:03:51.878437 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 20 18:03:51.878449 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 20 18:03:51.878467 systemd[1]: Mounting media.mount - External Media Directory... Mar 20 18:03:51.878480 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:51.878495 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 20 18:03:51.878507 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 20 18:03:51.878520 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 20 18:03:51.878533 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 20 18:03:51.878546 systemd[1]: Reached target machines.target - Containers. Mar 20 18:03:51.878558 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 20 18:03:51.878571 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 20 18:03:51.878584 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 20 18:03:51.878597 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 20 18:03:51.878612 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 20 18:03:51.878624 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 20 18:03:51.878637 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 20 18:03:51.878649 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 20 18:03:51.878664 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 20 18:03:51.878676 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 20 18:03:51.878689 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 20 18:03:51.878702 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. 
Mar 20 18:03:51.878716 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 20 18:03:51.878729 systemd[1]: Stopped systemd-fsck-usr.service. Mar 20 18:03:51.878742 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 20 18:03:51.878755 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 20 18:03:51.878767 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 20 18:03:51.878786 kernel: loop: module loaded Mar 20 18:03:51.878799 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 20 18:03:51.878810 kernel: fuse: init (API version 7.39) Mar 20 18:03:51.878822 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 20 18:03:51.878853 systemd-journald[1119]: Collecting audit messages is disabled. Mar 20 18:03:51.878876 systemd-journald[1119]: Journal started Mar 20 18:03:51.878901 systemd-journald[1119]: Runtime Journal (/run/log/journal/ea181759030a43b68cbf5e6e1c047280) is 6M, max 48.3M, 42.3M free. Mar 20 18:03:51.883690 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 20 18:03:51.883728 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 20 18:03:51.677600 systemd[1]: Queued start job for default target multi-user.target. Mar 20 18:03:51.692856 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 20 18:03:51.693369 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 20 18:03:51.901957 systemd[1]: verity-setup.service: Deactivated successfully. Mar 20 18:03:51.902037 systemd[1]: Stopped verity-setup.service. Mar 20 18:03:51.910952 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:51.914572 systemd[1]: Started systemd-journald.service - Journal Service. Mar 20 18:03:51.917956 kernel: ACPI: bus type drm_connector registered Mar 20 18:03:51.916808 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 20 18:03:51.919355 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 20 18:03:51.922128 systemd[1]: Mounted media.mount - External Media Directory. Mar 20 18:03:51.923344 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 20 18:03:51.924537 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 20 18:03:51.925735 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 20 18:03:51.927046 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 20 18:03:51.928580 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 20 18:03:51.928807 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 20 18:03:51.930307 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 20 18:03:51.930599 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 20 18:03:51.932097 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 20 18:03:51.932312 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 20 18:03:51.933676 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Mar 20 18:03:51.933925 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 20 18:03:51.935430 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 20 18:03:51.935640 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 20 18:03:51.937029 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 20 18:03:51.937233 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 20 18:03:51.938687 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 20 18:03:51.940210 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 20 18:03:51.941891 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 20 18:03:51.943621 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 20 18:03:51.957946 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 20 18:03:51.960673 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 20 18:03:51.962839 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 20 18:03:51.964030 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 20 18:03:51.964058 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 20 18:03:51.966029 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 20 18:03:51.973594 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 20 18:03:51.976897 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 20 18:03:51.978044 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 20 18:03:51.981031 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 20 18:03:51.983578 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 20 18:03:51.985074 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 20 18:03:51.988042 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 20 18:03:51.989157 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 20 18:03:51.992029 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 20 18:03:51.995150 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 20 18:03:51.998533 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 20 18:03:52.000224 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 20 18:03:52.002302 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 20 18:03:52.004107 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 20 18:03:52.008653 systemd-journald[1119]: Time spent on flushing to /var/log/journal/ea181759030a43b68cbf5e6e1c047280 is 17.211ms for 967 entries. Mar 20 18:03:52.008653 systemd-journald[1119]: System Journal (/var/log/journal/ea181759030a43b68cbf5e6e1c047280) is 8M, max 195.6M, 187.6M free. 
Mar 20 18:03:52.032828 systemd-journald[1119]: Received client request to flush runtime journal. Mar 20 18:03:52.032864 kernel: loop0: detected capacity change from 0 to 109808 Mar 20 18:03:52.017392 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 20 18:03:52.022105 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 20 18:03:52.026021 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 20 18:03:52.035052 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 20 18:03:52.038139 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 20 18:03:52.039909 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 20 18:03:52.047936 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 20 18:03:52.049487 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 20 18:03:52.055522 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 20 18:03:52.064575 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 20 18:03:52.074423 udevadm[1194]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 20 18:03:52.076929 kernel: loop1: detected capacity change from 0 to 151640 Mar 20 18:03:52.082108 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 20 18:03:52.085597 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 20 18:03:52.111052 kernel: loop2: detected capacity change from 0 to 210664 Mar 20 18:03:52.115265 systemd-tmpfiles[1198]: ACLs are not supported, ignoring. Mar 20 18:03:52.115282 systemd-tmpfiles[1198]: ACLs are not supported, ignoring. Mar 20 18:03:52.122539 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 20 18:03:52.147940 kernel: loop3: detected capacity change from 0 to 109808 Mar 20 18:03:52.156950 kernel: loop4: detected capacity change from 0 to 151640 Mar 20 18:03:52.171939 kernel: loop5: detected capacity change from 0 to 210664 Mar 20 18:03:52.179731 (sd-merge)[1203]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Mar 20 18:03:52.180357 (sd-merge)[1203]: Merged extensions into '/usr'. Mar 20 18:03:52.186030 systemd[1]: Reload requested from client PID 1174 ('systemd-sysext') (unit systemd-sysext.service)... Mar 20 18:03:52.186050 systemd[1]: Reloading... Mar 20 18:03:52.243988 zram_generator::config[1233]: No configuration found. Mar 20 18:03:52.320331 ldconfig[1169]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 20 18:03:52.376180 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 18:03:52.441273 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 20 18:03:52.441942 systemd[1]: Reloading finished in 255 ms. Mar 20 18:03:52.466887 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 20 18:03:52.468445 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 20 18:03:52.484349 systemd[1]: Starting ensure-sysext.service... 
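
The (sd-merge) lines record systemd-sysext overlaying the containerd-flatcar, docker-flatcar and kubernetes extension images onto /usr, after which systemd reloads its units. A sketch that lists the images staged for such a merge by scanning the standard sysext search directories; the directory list is an assumption based on systemd's documented defaults, and /etc/extensions is where Ignition placed the kubernetes.raw link earlier in this log.

    from pathlib import Path

    # Standard systemd-sysext search paths (assumed defaults; /etc/extensions is where
    # Ignition placed the kubernetes.raw link earlier in this log).
    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    def staged_sysexts():
        """Return {extension name: path} for images systemd-sysext would consider."""
        images = {}
        for d in SEARCH_DIRS:
            p = Path(d)
            if not p.is_dir():
                continue
            for entry in sorted(p.iterdir()):
                # sysext accepts raw disk images (*.raw) as well as plain directory trees.
                if entry.suffix == ".raw" or entry.is_dir():
                    name = entry.stem if entry.suffix == ".raw" else entry.name
                    images.setdefault(name, str(entry))   # first hit wins
        return images

    for name, path in staged_sysexts().items():
        print(f"{name}: {path}")
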
Mar 20 18:03:52.488358 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 20 18:03:52.508708 systemd-tmpfiles[1269]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 20 18:03:52.509030 systemd-tmpfiles[1269]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 20 18:03:52.509988 systemd-tmpfiles[1269]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 20 18:03:52.510261 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Mar 20 18:03:52.510339 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Mar 20 18:03:52.511115 systemd[1]: Reload requested from client PID 1268 ('systemctl') (unit ensure-sysext.service)... Mar 20 18:03:52.511133 systemd[1]: Reloading... Mar 20 18:03:52.514147 systemd-tmpfiles[1269]: Detected autofs mount point /boot during canonicalization of boot. Mar 20 18:03:52.514160 systemd-tmpfiles[1269]: Skipping /boot Mar 20 18:03:52.527144 systemd-tmpfiles[1269]: Detected autofs mount point /boot during canonicalization of boot. Mar 20 18:03:52.527271 systemd-tmpfiles[1269]: Skipping /boot Mar 20 18:03:52.572362 zram_generator::config[1304]: No configuration found. Mar 20 18:03:52.673030 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 18:03:52.737759 systemd[1]: Reloading finished in 226 ms. Mar 20 18:03:52.753585 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 20 18:03:52.771727 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 20 18:03:52.781760 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 20 18:03:52.784387 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 20 18:03:52.792104 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 20 18:03:52.795562 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 20 18:03:52.798583 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 20 18:03:52.802248 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 20 18:03:52.811984 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:52.812226 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 20 18:03:52.814415 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 20 18:03:52.818311 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 20 18:03:52.822367 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 20 18:03:52.823769 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 20 18:03:52.823892 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 20 18:03:52.834017 systemd[1]: Starting systemd-userdbd.service - User Database Manager... 
Mar 20 18:03:52.837409 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:52.839020 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 20 18:03:52.840125 systemd-udevd[1341]: Using default interface naming scheme 'v255'. Mar 20 18:03:52.841866 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 20 18:03:52.842091 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 20 18:03:52.844445 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 20 18:03:52.844657 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 20 18:03:52.846686 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 20 18:03:52.847005 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 20 18:03:52.858741 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:52.860995 augenrules[1370]: No rules Mar 20 18:03:52.860073 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 20 18:03:52.861662 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 20 18:03:52.865685 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 20 18:03:52.877335 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 20 18:03:52.878557 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 20 18:03:52.878663 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 20 18:03:52.879983 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 20 18:03:52.881065 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:52.882444 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 20 18:03:52.884363 systemd[1]: audit-rules.service: Deactivated successfully. Mar 20 18:03:52.884610 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 20 18:03:52.886322 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 20 18:03:52.888148 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 20 18:03:52.891522 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 20 18:03:52.893227 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 20 18:03:52.893448 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 20 18:03:52.895175 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 20 18:03:52.895378 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 20 18:03:52.899150 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 20 18:03:52.899491 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 20 18:03:52.902336 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Mar 20 18:03:52.922565 systemd[1]: Finished ensure-sysext.service. Mar 20 18:03:52.927391 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:52.932036 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 20 18:03:52.933335 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 20 18:03:52.936314 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 20 18:03:52.943268 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 20 18:03:52.946116 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 20 18:03:52.953149 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 20 18:03:52.954545 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 20 18:03:52.954578 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 20 18:03:52.964007 augenrules[1411]: /sbin/augenrules: No change Mar 20 18:03:52.967892 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 20 18:03:52.970617 systemd-resolved[1340]: Positive Trust Anchors: Mar 20 18:03:52.970626 systemd-resolved[1340]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 20 18:03:52.970657 systemd-resolved[1340]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 20 18:03:52.973413 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 20 18:03:52.974938 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 20 18:03:52.974970 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 20 18:03:52.975776 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 20 18:03:52.985132 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1393) Mar 20 18:03:52.977022 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 20 18:03:52.985248 augenrules[1440]: No rules Mar 20 18:03:52.977290 systemd-resolved[1340]: Defaulting to hostname 'linux'. Mar 20 18:03:52.978627 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 20 18:03:52.978873 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 20 18:03:52.981152 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 20 18:03:52.982534 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Mar 20 18:03:52.982755 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 20 18:03:52.987384 systemd[1]: audit-rules.service: Deactivated successfully. Mar 20 18:03:52.987621 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 20 18:03:52.989215 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 20 18:03:52.989443 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 20 18:03:52.997652 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 20 18:03:53.014309 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 20 18:03:53.022833 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 20 18:03:53.026052 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 20 18:03:53.027241 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 20 18:03:53.027308 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 20 18:03:53.046936 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 20 18:03:53.052209 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Mar 20 18:03:53.057936 kernel: ACPI: button: Power Button [PWRF] Mar 20 18:03:53.059685 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 20 18:03:53.063591 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 20 18:03:53.065891 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Mar 20 18:03:53.068037 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 20 18:03:53.083043 systemd-networkd[1430]: lo: Link UP Mar 20 18:03:53.083053 systemd-networkd[1430]: lo: Gained carrier Mar 20 18:03:53.084795 systemd-networkd[1430]: Enumeration completed Mar 20 18:03:53.084892 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 20 18:03:53.085375 systemd-networkd[1430]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 20 18:03:53.085386 systemd-networkd[1430]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 20 18:03:53.092377 systemd[1]: Reached target network.target - Network. Mar 20 18:03:53.095279 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 20 18:03:53.097150 systemd-networkd[1430]: eth0: Link UP Mar 20 18:03:53.097161 systemd-networkd[1430]: eth0: Gained carrier Mar 20 18:03:53.097176 systemd-networkd[1430]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 20 18:03:53.098492 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 20 18:03:53.101152 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 20 18:03:53.102951 systemd[1]: Reached target time-set.target - System Time Set. Mar 20 18:03:53.108962 systemd-networkd[1430]: eth0: DHCPv4 address 10.0.0.124/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 20 18:03:53.110289 systemd-timesyncd[1431]: Network configuration changed, trying to establish connection. 
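
eth0 here is matched by the stock /usr/lib/systemd/network/zz-default.network unit and obtains 10.0.0.124/16 plus gateway 10.0.0.1 over DHCP. A sketch of writing an explicit per-interface .network file with the same effect, using the standard [Match] Name= and [Network] DHCP= keys; the file name and path are hypothetical, the write needs root, and on the logged system the shipped default already covers this.

    from pathlib import Path

    # Illustrative unit; on the logged system the shipped zz-default.network already
    # configures DHCP, so this is only about making the intent explicit.
    unit = """\
    [Match]
    Name=eth0

    [Network]
    DHCP=yes
    """

    target = Path("/etc/systemd/network/10-eth0.network")   # hypothetical file name
    target.parent.mkdir(parents=True, exist_ok=True)        # needs root
    target.write_text(unit)
    print(f"wrote {target}; apply with: networkctl reload")
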
Mar 20 18:03:54.045388 systemd-resolved[1340]: Clock change detected. Flushing caches. Mar 20 18:03:54.045437 systemd-timesyncd[1431]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 20 18:03:54.045476 systemd-timesyncd[1431]: Initial clock synchronization to Thu 2025-03-20 18:03:54.045360 UTC. Mar 20 18:03:54.062781 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 20 18:03:54.074497 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 18:03:54.114321 kernel: mousedev: PS/2 mouse device common for all mice Mar 20 18:03:54.125717 kernel: kvm_amd: TSC scaling supported Mar 20 18:03:54.125752 kernel: kvm_amd: Nested Virtualization enabled Mar 20 18:03:54.125766 kernel: kvm_amd: Nested Paging enabled Mar 20 18:03:54.125790 kernel: kvm_amd: LBR virtualization supported Mar 20 18:03:54.126789 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Mar 20 18:03:54.126804 kernel: kvm_amd: Virtual GIF supported Mar 20 18:03:54.146324 kernel: EDAC MC: Ver: 3.0.0 Mar 20 18:03:54.178787 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 20 18:03:54.200547 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 20 18:03:54.202014 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 20 18:03:54.224855 lvm[1471]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 20 18:03:54.261236 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 20 18:03:54.262827 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 20 18:03:54.263975 systemd[1]: Reached target sysinit.target - System Initialization. Mar 20 18:03:54.265111 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 20 18:03:54.266365 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 20 18:03:54.267764 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 20 18:03:54.268921 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 20 18:03:54.270161 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 20 18:03:54.271385 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 20 18:03:54.271410 systemd[1]: Reached target paths.target - Path Units. Mar 20 18:03:54.272296 systemd[1]: Reached target timers.target - Timer Units. Mar 20 18:03:54.273948 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 20 18:03:54.276521 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 20 18:03:54.279780 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 20 18:03:54.281141 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 20 18:03:54.282355 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 20 18:03:54.285816 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 20 18:03:54.287188 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 20 18:03:54.289407 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... 
Mar 20 18:03:54.290940 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 20 18:03:54.292044 systemd[1]: Reached target sockets.target - Socket Units. Mar 20 18:03:54.292969 systemd[1]: Reached target basic.target - Basic System. Mar 20 18:03:54.293903 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 20 18:03:54.293931 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 20 18:03:54.294843 systemd[1]: Starting containerd.service - containerd container runtime... Mar 20 18:03:54.296776 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 20 18:03:54.298466 lvm[1476]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 20 18:03:54.301367 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 20 18:03:54.303403 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 20 18:03:54.304416 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 20 18:03:54.305483 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 20 18:03:54.307524 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 20 18:03:54.309543 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 20 18:03:54.312066 jq[1479]: false Mar 20 18:03:54.312539 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 20 18:03:54.319555 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 20 18:03:54.321390 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 20 18:03:54.321855 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 20 18:03:54.325041 systemd[1]: Starting update-engine.service - Update Engine... Mar 20 18:03:54.328444 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 20 18:03:54.329751 dbus-daemon[1478]: [system] SELinux support is enabled Mar 20 18:03:54.330932 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 20 18:03:54.332698 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 20 18:03:54.340607 extend-filesystems[1480]: Found loop3 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found loop4 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found loop5 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found sr0 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found vda Mar 20 18:03:54.340607 extend-filesystems[1480]: Found vda1 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found vda2 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found vda3 Mar 20 18:03:54.340607 extend-filesystems[1480]: Found usr Mar 20 18:03:54.342321 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Mar 20 18:03:54.355848 extend-filesystems[1480]: Found vda4 Mar 20 18:03:54.355848 extend-filesystems[1480]: Found vda6 Mar 20 18:03:54.355848 extend-filesystems[1480]: Found vda7 Mar 20 18:03:54.355848 extend-filesystems[1480]: Found vda9 Mar 20 18:03:54.355848 extend-filesystems[1480]: Checking size of /dev/vda9 Mar 20 18:03:54.342597 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 20 18:03:54.359753 update_engine[1486]: I20250320 18:03:54.355785 1486 main.cc:92] Flatcar Update Engine starting Mar 20 18:03:54.359753 update_engine[1486]: I20250320 18:03:54.356973 1486 update_check_scheduler.cc:74] Next update check in 5m27s Mar 20 18:03:54.344599 systemd[1]: motdgen.service: Deactivated successfully. Mar 20 18:03:54.360088 jq[1488]: true Mar 20 18:03:54.344836 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 20 18:03:54.357645 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 20 18:03:54.357939 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 20 18:03:54.363681 extend-filesystems[1480]: Resized partition /dev/vda9 Mar 20 18:03:54.372039 extend-filesystems[1511]: resize2fs 1.47.2 (1-Jan-2025) Mar 20 18:03:54.382903 jq[1504]: true Mar 20 18:03:54.384412 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 20 18:03:54.384033 (ntainerd)[1505]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 20 18:03:54.387315 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1404) Mar 20 18:03:54.395523 tar[1501]: linux-amd64/helm Mar 20 18:03:54.400040 systemd[1]: Started update-engine.service - Update Engine. Mar 20 18:03:54.401434 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 20 18:03:54.401458 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 20 18:03:54.404532 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 20 18:03:54.404558 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 20 18:03:54.413445 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 20 18:03:54.424314 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 20 18:03:54.450773 extend-filesystems[1511]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 20 18:03:54.450773 extend-filesystems[1511]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 20 18:03:54.450773 extend-filesystems[1511]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 20 18:03:54.448471 systemd-logind[1485]: Watching system buttons on /dev/input/event2 (Power Button) Mar 20 18:03:54.454777 extend-filesystems[1480]: Resized filesystem in /dev/vda9 Mar 20 18:03:54.448492 systemd-logind[1485]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 20 18:03:54.448952 systemd-logind[1485]: New seat seat0. Mar 20 18:03:54.451680 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 20 18:03:54.451952 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
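extend-filesystems above grows the root ext4 filesystem on /dev/vda9 online from 553472 to 1864699 4k blocks using resize2fs 1.47.2. A sketch of the equivalent manual steps, assuming /dev/vda9 is mounted at / as in this log:

    # confirm the current size, resize the mounted ext4 filesystem to fill its partition, re-check
    df -h /
    sudo resize2fs /dev/vda9
    df -h /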
Mar 20 18:03:54.459765 systemd[1]: Started systemd-logind.service - User Login Management. Mar 20 18:03:54.472819 bash[1533]: Updated "/home/core/.ssh/authorized_keys" Mar 20 18:03:54.473258 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 20 18:03:54.476863 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 20 18:03:54.479517 locksmithd[1520]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 20 18:03:54.581084 containerd[1505]: time="2025-03-20T18:03:54Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 20 18:03:54.582147 containerd[1505]: time="2025-03-20T18:03:54.582089023Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 20 18:03:54.594793 containerd[1505]: time="2025-03-20T18:03:54.594756027Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.97µs" Mar 20 18:03:54.594793 containerd[1505]: time="2025-03-20T18:03:54.594780182Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 20 18:03:54.594793 containerd[1505]: time="2025-03-20T18:03:54.594796773Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 20 18:03:54.594972 containerd[1505]: time="2025-03-20T18:03:54.594946094Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 20 18:03:54.594972 containerd[1505]: time="2025-03-20T18:03:54.594967113Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 20 18:03:54.595025 containerd[1505]: time="2025-03-20T18:03:54.594990277Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595071 containerd[1505]: time="2025-03-20T18:03:54.595051030Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595071 containerd[1505]: time="2025-03-20T18:03:54.595067111Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595350 containerd[1505]: time="2025-03-20T18:03:54.595290059Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595517 containerd[1505]: time="2025-03-20T18:03:54.595476609Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595568 containerd[1505]: time="2025-03-20T18:03:54.595519660Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595568 containerd[1505]: time="2025-03-20T18:03:54.595530550Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595669 containerd[1505]: time="2025-03-20T18:03:54.595649623Z" level=info msg="loading 
plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595924 containerd[1505]: time="2025-03-20T18:03:54.595903159Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595969 containerd[1505]: time="2025-03-20T18:03:54.595938515Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 20 18:03:54.595969 containerd[1505]: time="2025-03-20T18:03:54.595949676Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 20 18:03:54.596015 containerd[1505]: time="2025-03-20T18:03:54.595982658Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 20 18:03:54.596296 containerd[1505]: time="2025-03-20T18:03:54.596256432Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 20 18:03:54.596408 containerd[1505]: time="2025-03-20T18:03:54.596383520Z" level=info msg="metadata content store policy set" policy=shared Mar 20 18:03:54.601947 containerd[1505]: time="2025-03-20T18:03:54.601914156Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 20 18:03:54.601985 containerd[1505]: time="2025-03-20T18:03:54.601953320Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 20 18:03:54.601985 containerd[1505]: time="2025-03-20T18:03:54.601969510Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 20 18:03:54.601985 containerd[1505]: time="2025-03-20T18:03:54.601984027Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 20 18:03:54.602063 containerd[1505]: time="2025-03-20T18:03:54.601999576Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 20 18:03:54.602063 containerd[1505]: time="2025-03-20T18:03:54.602016548Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 20 18:03:54.602063 containerd[1505]: time="2025-03-20T18:03:54.602030555Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 20 18:03:54.602063 containerd[1505]: time="2025-03-20T18:03:54.602043599Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 20 18:03:54.602063 containerd[1505]: time="2025-03-20T18:03:54.602054730Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 20 18:03:54.602063 containerd[1505]: time="2025-03-20T18:03:54.602065690Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 20 18:03:54.602186 containerd[1505]: time="2025-03-20T18:03:54.602077923Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 20 18:03:54.602186 containerd[1505]: time="2025-03-20T18:03:54.602090427Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 20 18:03:54.602225 containerd[1505]: time="2025-03-20T18:03:54.602207386Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 20 18:03:54.602245 containerd[1505]: time="2025-03-20T18:03:54.602226081Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 20 18:03:54.602245 containerd[1505]: time="2025-03-20T18:03:54.602238976Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 20 18:03:54.602282 containerd[1505]: time="2025-03-20T18:03:54.602249646Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 20 18:03:54.602282 containerd[1505]: time="2025-03-20T18:03:54.602267629Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 20 18:03:54.602282 containerd[1505]: time="2025-03-20T18:03:54.602279181Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 20 18:03:54.602354 containerd[1505]: time="2025-03-20T18:03:54.602291544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 20 18:03:54.602354 containerd[1505]: time="2025-03-20T18:03:54.602328884Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 20 18:03:54.602354 containerd[1505]: time="2025-03-20T18:03:54.602342630Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 20 18:03:54.602422 containerd[1505]: time="2025-03-20T18:03:54.602355414Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 20 18:03:54.602422 containerd[1505]: time="2025-03-20T18:03:54.602372897Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 20 18:03:54.602460 containerd[1505]: time="2025-03-20T18:03:54.602437328Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 20 18:03:54.602460 containerd[1505]: time="2025-03-20T18:03:54.602451514Z" level=info msg="Start snapshots syncer" Mar 20 18:03:54.602501 containerd[1505]: time="2025-03-20T18:03:54.602486179Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 20 18:03:54.602730 containerd[1505]: time="2025-03-20T18:03:54.602688659Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 20 18:03:54.602838 containerd[1505]: time="2025-03-20T18:03:54.602732632Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 20 18:03:54.602838 containerd[1505]: time="2025-03-20T18:03:54.602800239Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 20 18:03:54.602925 containerd[1505]: time="2025-03-20T18:03:54.602895607Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 20 18:03:54.602925 containerd[1505]: time="2025-03-20T18:03:54.602921286Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 20 18:03:54.602968 containerd[1505]: time="2025-03-20T18:03:54.602934420Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 20 18:03:54.602968 containerd[1505]: time="2025-03-20T18:03:54.602945201Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 20 18:03:54.602968 containerd[1505]: time="2025-03-20T18:03:54.602958746Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 20 18:03:54.603030 containerd[1505]: time="2025-03-20T18:03:54.602970368Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 20 18:03:54.603030 containerd[1505]: time="2025-03-20T18:03:54.602981709Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 20 18:03:54.603030 containerd[1505]: time="2025-03-20T18:03:54.603003570Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 20 18:03:54.603030 containerd[1505]: 
time="2025-03-20T18:03:54.603018608Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 20 18:03:54.603030 containerd[1505]: time="2025-03-20T18:03:54.603029108Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603841201Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603865237Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603877480Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603887178Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603895934Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603905823Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603916743Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 20 18:03:54.603929 containerd[1505]: time="2025-03-20T18:03:54.603937121Z" level=info msg="runtime interface created" Mar 20 18:03:54.604095 containerd[1505]: time="2025-03-20T18:03:54.603943684Z" level=info msg="created NRI interface" Mar 20 18:03:54.604095 containerd[1505]: time="2025-03-20T18:03:54.603957540Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 20 18:03:54.604095 containerd[1505]: time="2025-03-20T18:03:54.603968400Z" level=info msg="Connect containerd service" Mar 20 18:03:54.604095 containerd[1505]: time="2025-03-20T18:03:54.603990602Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 20 18:03:54.605437 containerd[1505]: time="2025-03-20T18:03:54.605406939Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 20 18:03:54.622083 sshd_keygen[1493]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 20 18:03:54.645563 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 20 18:03:54.649404 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 20 18:03:54.666022 systemd[1]: issuegen.service: Deactivated successfully. Mar 20 18:03:54.666311 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 20 18:03:54.669266 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 20 18:03:54.690127 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 20 18:03:54.693689 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.695882247Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.695953441Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.695987154Z" level=info msg="Start subscribing containerd event" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696023512Z" level=info msg="Start recovering state" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696124071Z" level=info msg="Start event monitor" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696136875Z" level=info msg="Start cni network conf syncer for default" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696147825Z" level=info msg="Start streaming server" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696155820Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696164026Z" level=info msg="runtime interface starting up..." Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696173063Z" level=info msg="starting plugins..." Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.696187460Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 20 18:03:54.697871 containerd[1505]: time="2025-03-20T18:03:54.697285590Z" level=info msg="containerd successfully booted in 0.116681s" Mar 20 18:03:54.696550 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 20 18:03:54.698432 systemd[1]: Reached target getty.target - Login Prompts. Mar 20 18:03:54.701531 systemd[1]: Started containerd.service - containerd container runtime. Mar 20 18:03:54.807832 tar[1501]: linux-amd64/LICENSE Mar 20 18:03:54.807964 tar[1501]: linux-amd64/README.md Mar 20 18:03:54.831192 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 20 18:03:55.645529 systemd-networkd[1430]: eth0: Gained IPv6LL Mar 20 18:03:55.649338 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 20 18:03:55.651602 systemd[1]: Reached target network-online.target - Network is Online. Mar 20 18:03:55.654804 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 20 18:03:55.657710 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:03:55.667569 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 20 18:03:55.693712 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 20 18:03:55.695459 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 20 18:03:55.695709 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 20 18:03:55.698024 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 20 18:03:56.755568 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:03:56.757282 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 20 18:03:56.758703 systemd[1]: Startup finished in 673ms (kernel) + 5.365s (initrd) + 4.793s (userspace) = 10.832s. 
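containerd boots successfully, but its CRI plugin has already warned that no network config was found in /etc/cni/net.d, so pod networking is not yet usable. That directory is normally populated later by a CNI add-on (flannel, Calico, and the like); purely to illustrate the kind of file containerd is looking for, a bridge conflist could look like the sketch below (file name and subnet are hypothetical):

    # illustrative only -- a CNI add-on usually writes this, not the administrator
    sudo tee /etc/cni/net.d/10-example.conflist <<'EOF'
    {
      "cniVersion": "1.0.0",
      "name": "example-net",
      "plugins": [
        { "type": "bridge", "bridge": "cni0", "isGateway": true, "ipMasq": true,
          "ipam": { "type": "host-local", "subnet": "10.88.0.0/16" } },
        { "type": "portmap", "capabilities": { "portMappings": true } }
      ]
    }
    EOF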
Mar 20 18:03:56.791701 (kubelet)[1606]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 20 18:03:57.224274 kubelet[1606]: E0320 18:03:57.224131 1606 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 20 18:03:57.228686 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 20 18:03:57.228888 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 20 18:03:57.229277 systemd[1]: kubelet.service: Consumed 1.421s CPU time, 243.8M memory peak. Mar 20 18:03:59.555884 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 20 18:03:59.557418 systemd[1]: Started sshd@0-10.0.0.124:22-10.0.0.1:36184.service - OpenSSH per-connection server daemon (10.0.0.1:36184). Mar 20 18:03:59.621027 sshd[1621]: Accepted publickey for core from 10.0.0.1 port 36184 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:03:59.623132 sshd-session[1621]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:03:59.635905 systemd-logind[1485]: New session 1 of user core. Mar 20 18:03:59.637562 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 20 18:03:59.639116 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 20 18:03:59.666697 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 20 18:03:59.669566 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 20 18:03:59.687704 (systemd)[1625]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 20 18:03:59.690294 systemd-logind[1485]: New session c1 of user core. Mar 20 18:03:59.829763 systemd[1625]: Queued start job for default target default.target. Mar 20 18:03:59.842665 systemd[1625]: Created slice app.slice - User Application Slice. Mar 20 18:03:59.842690 systemd[1625]: Reached target paths.target - Paths. Mar 20 18:03:59.842732 systemd[1625]: Reached target timers.target - Timers. Mar 20 18:03:59.844259 systemd[1625]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 20 18:03:59.854831 systemd[1625]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 20 18:03:59.854969 systemd[1625]: Reached target sockets.target - Sockets. Mar 20 18:03:59.855030 systemd[1625]: Reached target basic.target - Basic System. Mar 20 18:03:59.855088 systemd[1625]: Reached target default.target - Main User Target. Mar 20 18:03:59.855129 systemd[1625]: Startup finished in 157ms. Mar 20 18:03:59.855520 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 20 18:03:59.857231 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 20 18:03:59.919784 systemd[1]: Started sshd@1-10.0.0.124:22-10.0.0.1:36200.service - OpenSSH per-connection server daemon (10.0.0.1:36200). Mar 20 18:03:59.975831 sshd[1636]: Accepted publickey for core from 10.0.0.1 port 36200 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:03:59.977271 sshd-session[1636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:03:59.981474 systemd-logind[1485]: New session 2 of user core. 
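The first kubelet start fails because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-managed node that file is written by kubeadm init or kubeadm join, so the failure is expected this early in the bootstrap. Only to illustrate the format the error refers to, a minimal KubeletConfiguration might look like the following (the values are assumptions, not taken from this host):

    # normally generated by kubeadm -- shown here only as a sketch of the expected file
    sudo tee /var/lib/kubelet/config.yaml <<'EOF'
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    EOF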
Mar 20 18:04:00.001430 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 20 18:04:00.054268 sshd[1638]: Connection closed by 10.0.0.1 port 36200 Mar 20 18:04:00.054598 sshd-session[1636]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:00.065003 systemd[1]: sshd@1-10.0.0.124:22-10.0.0.1:36200.service: Deactivated successfully. Mar 20 18:04:00.066754 systemd[1]: session-2.scope: Deactivated successfully. Mar 20 18:04:00.068417 systemd-logind[1485]: Session 2 logged out. Waiting for processes to exit. Mar 20 18:04:00.069601 systemd[1]: Started sshd@2-10.0.0.124:22-10.0.0.1:36202.service - OpenSSH per-connection server daemon (10.0.0.1:36202). Mar 20 18:04:00.070315 systemd-logind[1485]: Removed session 2. Mar 20 18:04:00.123652 sshd[1643]: Accepted publickey for core from 10.0.0.1 port 36202 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:00.124974 sshd-session[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:00.128833 systemd-logind[1485]: New session 3 of user core. Mar 20 18:04:00.138419 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 20 18:04:00.187610 sshd[1646]: Connection closed by 10.0.0.1 port 36202 Mar 20 18:04:00.187965 sshd-session[1643]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:00.205843 systemd[1]: sshd@2-10.0.0.124:22-10.0.0.1:36202.service: Deactivated successfully. Mar 20 18:04:00.207646 systemd[1]: session-3.scope: Deactivated successfully. Mar 20 18:04:00.209335 systemd-logind[1485]: Session 3 logged out. Waiting for processes to exit. Mar 20 18:04:00.210575 systemd[1]: Started sshd@3-10.0.0.124:22-10.0.0.1:36208.service - OpenSSH per-connection server daemon (10.0.0.1:36208). Mar 20 18:04:00.211502 systemd-logind[1485]: Removed session 3. Mar 20 18:04:00.258946 sshd[1651]: Accepted publickey for core from 10.0.0.1 port 36208 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:00.260428 sshd-session[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:00.264549 systemd-logind[1485]: New session 4 of user core. Mar 20 18:04:00.274420 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 20 18:04:00.326710 sshd[1654]: Connection closed by 10.0.0.1 port 36208 Mar 20 18:04:00.327029 sshd-session[1651]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:00.339179 systemd[1]: sshd@3-10.0.0.124:22-10.0.0.1:36208.service: Deactivated successfully. Mar 20 18:04:00.341061 systemd[1]: session-4.scope: Deactivated successfully. Mar 20 18:04:00.342575 systemd-logind[1485]: Session 4 logged out. Waiting for processes to exit. Mar 20 18:04:00.343934 systemd[1]: Started sshd@4-10.0.0.124:22-10.0.0.1:36212.service - OpenSSH per-connection server daemon (10.0.0.1:36212). Mar 20 18:04:00.344763 systemd-logind[1485]: Removed session 4. Mar 20 18:04:00.401836 sshd[1659]: Accepted publickey for core from 10.0.0.1 port 36212 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:00.403269 sshd-session[1659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:00.407403 systemd-logind[1485]: New session 5 of user core. Mar 20 18:04:00.417423 systemd[1]: Started session-5.scope - Session 5 of User core. 
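Sessions 2 through 4 above each open and close within a fraction of a second, which looks like automated provisioning probing the host over SSH; every connection gets its own per-connection sshd@... unit and a logind session. A sketch of how these can be inspected while they are live:

    # per-connection sshd units and logind sessions corresponding to the bursts above
    systemctl list-units 'sshd@*'
    loginctl list-sessions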
Mar 20 18:04:00.475438 sudo[1663]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 20 18:04:00.475806 sudo[1663]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 20 18:04:00.496878 sudo[1663]: pam_unix(sudo:session): session closed for user root Mar 20 18:04:00.498442 sshd[1662]: Connection closed by 10.0.0.1 port 36212 Mar 20 18:04:00.498810 sshd-session[1659]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:00.511124 systemd[1]: sshd@4-10.0.0.124:22-10.0.0.1:36212.service: Deactivated successfully. Mar 20 18:04:00.512958 systemd[1]: session-5.scope: Deactivated successfully. Mar 20 18:04:00.514641 systemd-logind[1485]: Session 5 logged out. Waiting for processes to exit. Mar 20 18:04:00.516015 systemd[1]: Started sshd@5-10.0.0.124:22-10.0.0.1:36220.service - OpenSSH per-connection server daemon (10.0.0.1:36220). Mar 20 18:04:00.516740 systemd-logind[1485]: Removed session 5. Mar 20 18:04:00.566759 sshd[1668]: Accepted publickey for core from 10.0.0.1 port 36220 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:00.568176 sshd-session[1668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:00.572427 systemd-logind[1485]: New session 6 of user core. Mar 20 18:04:00.582418 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 20 18:04:00.635628 sudo[1673]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 20 18:04:00.635933 sudo[1673]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 20 18:04:00.639567 sudo[1673]: pam_unix(sudo:session): session closed for user root Mar 20 18:04:00.646370 sudo[1672]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 20 18:04:00.646712 sudo[1672]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 20 18:04:00.656281 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 20 18:04:00.694640 augenrules[1695]: No rules Mar 20 18:04:00.696413 systemd[1]: audit-rules.service: Deactivated successfully. Mar 20 18:04:00.696677 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 20 18:04:00.697827 sudo[1672]: pam_unix(sudo:session): session closed for user root Mar 20 18:04:00.699318 sshd[1671]: Connection closed by 10.0.0.1 port 36220 Mar 20 18:04:00.699612 sshd-session[1668]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:00.708021 systemd[1]: sshd@5-10.0.0.124:22-10.0.0.1:36220.service: Deactivated successfully. Mar 20 18:04:00.709810 systemd[1]: session-6.scope: Deactivated successfully. Mar 20 18:04:00.711472 systemd-logind[1485]: Session 6 logged out. Waiting for processes to exit. Mar 20 18:04:00.712660 systemd[1]: Started sshd@6-10.0.0.124:22-10.0.0.1:36226.service - OpenSSH per-connection server daemon (10.0.0.1:36226). Mar 20 18:04:00.713369 systemd-logind[1485]: Removed session 6. Mar 20 18:04:00.758498 sshd[1703]: Accepted publickey for core from 10.0.0.1 port 36226 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:00.759900 sshd-session[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:00.763928 systemd-logind[1485]: New session 7 of user core. Mar 20 18:04:00.775439 systemd[1]: Started session-7.scope - Session 7 of User core. 
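The sudo entries above switch SELinux to enforcing mode and remove the default audit rules, after which augenrules reports "No rules". A short sketch of how the resulting state can be verified on the same host:

    getenforce                        # expected: Enforcing, after the setenforce 1 above
    sudo auditctl -l                  # expected: "No rules", matching the augenrules output
    systemctl status audit-rules.service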
Mar 20 18:04:00.828818 sudo[1707]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 20 18:04:00.829244 sudo[1707]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 20 18:04:01.117646 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 20 18:04:01.131619 (dockerd)[1727]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 20 18:04:01.383099 dockerd[1727]: time="2025-03-20T18:04:01.382973147Z" level=info msg="Starting up" Mar 20 18:04:01.384146 dockerd[1727]: time="2025-03-20T18:04:01.383775532Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 20 18:04:01.690197 dockerd[1727]: time="2025-03-20T18:04:01.690064462Z" level=info msg="Loading containers: start." Mar 20 18:04:01.858327 kernel: Initializing XFRM netlink socket Mar 20 18:04:01.929116 systemd-networkd[1430]: docker0: Link UP Mar 20 18:04:01.998889 dockerd[1727]: time="2025-03-20T18:04:01.998763634Z" level=info msg="Loading containers: done." Mar 20 18:04:02.012557 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3220323783-merged.mount: Deactivated successfully. Mar 20 18:04:02.014326 dockerd[1727]: time="2025-03-20T18:04:02.014262831Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 20 18:04:02.014392 dockerd[1727]: time="2025-03-20T18:04:02.014372106Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 20 18:04:02.014497 dockerd[1727]: time="2025-03-20T18:04:02.014478576Z" level=info msg="Daemon has completed initialization" Mar 20 18:04:02.049238 dockerd[1727]: time="2025-03-20T18:04:02.049174378Z" level=info msg="API listen on /run/docker.sock" Mar 20 18:04:02.049314 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 20 18:04:02.731987 containerd[1505]: time="2025-03-20T18:04:02.731949333Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 20 18:04:03.364896 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1379121474.mount: Deactivated successfully. 
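With Docker up on the overlay2 driver and containerd's CRI plugin pulling registry.k8s.io/kube-apiserver:v1.30.11, both runtimes can be queried directly. A sketch, assuming crictl is pointed at the containerd socket used above:

    docker info --format '{{.Driver}}'        # expected: overlay2
    sudo crictl --runtime-endpoint unix:///run/containerd/containerd.sock images | grep kube-apiserver
    sudo ctr -n k8s.io images ls | grep kube-apiserver    # CRI-pulled images live in the k8s.io namespace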
Mar 20 18:04:04.553869 containerd[1505]: time="2025-03-20T18:04:04.553804082Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:04.554623 containerd[1505]: time="2025-03-20T18:04:04.554522540Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674573" Mar 20 18:04:04.555678 containerd[1505]: time="2025-03-20T18:04:04.555644715Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:04.558319 containerd[1505]: time="2025-03-20T18:04:04.558271273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:04.559038 containerd[1505]: time="2025-03-20T18:04:04.559007945Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 1.827021472s" Mar 20 18:04:04.559074 containerd[1505]: time="2025-03-20T18:04:04.559037080Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\"" Mar 20 18:04:04.578779 containerd[1505]: time="2025-03-20T18:04:04.578730356Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 20 18:04:06.550907 containerd[1505]: time="2025-03-20T18:04:06.550836087Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:06.551691 containerd[1505]: time="2025-03-20T18:04:06.551608776Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619772" Mar 20 18:04:06.552932 containerd[1505]: time="2025-03-20T18:04:06.552903746Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:06.555182 containerd[1505]: time="2025-03-20T18:04:06.555152875Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:06.555951 containerd[1505]: time="2025-03-20T18:04:06.555917891Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 1.977140968s" Mar 20 18:04:06.555951 containerd[1505]: time="2025-03-20T18:04:06.555948237Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\"" Mar 20 
18:04:06.573949 containerd[1505]: time="2025-03-20T18:04:06.573918401Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 20 18:04:07.288617 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 20 18:04:07.290185 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:04:07.496462 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:04:07.514606 (kubelet)[2026]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 20 18:04:07.555821 kubelet[2026]: E0320 18:04:07.555688 2026 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 20 18:04:07.563378 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 20 18:04:07.563581 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 20 18:04:07.563982 systemd[1]: kubelet.service: Consumed 221ms CPU time, 99M memory peak. Mar 20 18:04:07.959121 containerd[1505]: time="2025-03-20T18:04:07.958987629Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:07.959994 containerd[1505]: time="2025-03-20T18:04:07.959912615Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903309" Mar 20 18:04:07.961060 containerd[1505]: time="2025-03-20T18:04:07.961026705Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:07.963453 containerd[1505]: time="2025-03-20T18:04:07.963423391Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:07.964291 containerd[1505]: time="2025-03-20T18:04:07.964250784Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.390293369s" Mar 20 18:04:07.964291 containerd[1505]: time="2025-03-20T18:04:07.964290518Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\"" Mar 20 18:04:07.982367 containerd[1505]: time="2025-03-20T18:04:07.982318209Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 20 18:04:09.275393 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount702667946.mount: Deactivated successfully. 
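The kubelet is brought back by systemd's restart logic ("restart counter is at 1") and fails again on the same missing config file; it will keep cycling until /var/lib/kubelet/config.yaml appears. A sketch of how to watch the loop:

    systemctl status kubelet
    journalctl -u kubelet -f          # follows the repeated config.yaml failures shown above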
Mar 20 18:04:09.519401 containerd[1505]: time="2025-03-20T18:04:09.519327819Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:09.521345 containerd[1505]: time="2025-03-20T18:04:09.521287476Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185372" Mar 20 18:04:09.523909 containerd[1505]: time="2025-03-20T18:04:09.523227595Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:09.525744 containerd[1505]: time="2025-03-20T18:04:09.525638989Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:09.526397 containerd[1505]: time="2025-03-20T18:04:09.526354552Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 1.543984074s" Mar 20 18:04:09.526441 containerd[1505]: time="2025-03-20T18:04:09.526396791Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\"" Mar 20 18:04:09.546354 containerd[1505]: time="2025-03-20T18:04:09.546325308Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 20 18:04:10.082341 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4204328589.mount: Deactivated successfully. 
Mar 20 18:04:11.096653 containerd[1505]: time="2025-03-20T18:04:11.096594524Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:11.097579 containerd[1505]: time="2025-03-20T18:04:11.097534277Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Mar 20 18:04:11.098570 containerd[1505]: time="2025-03-20T18:04:11.098538561Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:11.100928 containerd[1505]: time="2025-03-20T18:04:11.100885735Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:11.101835 containerd[1505]: time="2025-03-20T18:04:11.101804889Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.555449284s" Mar 20 18:04:11.101835 containerd[1505]: time="2025-03-20T18:04:11.101833583Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 20 18:04:11.119190 containerd[1505]: time="2025-03-20T18:04:11.119143307Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 20 18:04:11.616871 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount113416672.mount: Deactivated successfully. 
Mar 20 18:04:11.622667 containerd[1505]: time="2025-03-20T18:04:11.622617771Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:11.623399 containerd[1505]: time="2025-03-20T18:04:11.623350225Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290" Mar 20 18:04:11.624563 containerd[1505]: time="2025-03-20T18:04:11.624532123Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:11.626824 containerd[1505]: time="2025-03-20T18:04:11.626798294Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:11.627740 containerd[1505]: time="2025-03-20T18:04:11.627692452Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 508.503539ms" Mar 20 18:04:11.627777 containerd[1505]: time="2025-03-20T18:04:11.627744990Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Mar 20 18:04:11.645330 containerd[1505]: time="2025-03-20T18:04:11.645279075Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 20 18:04:12.151757 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3370520245.mount: Deactivated successfully. Mar 20 18:04:13.991885 containerd[1505]: time="2025-03-20T18:04:13.991825744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:13.994829 containerd[1505]: time="2025-03-20T18:04:13.994531761Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571" Mar 20 18:04:13.996209 containerd[1505]: time="2025-03-20T18:04:13.996162020Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:13.998867 containerd[1505]: time="2025-03-20T18:04:13.998838010Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:13.999998 containerd[1505]: time="2025-03-20T18:04:13.999964383Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 2.354579699s" Mar 20 18:04:14.000058 containerd[1505]: time="2025-03-20T18:04:13.999999048Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 20 18:04:15.877871 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
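Between 18:04:02 and 18:04:14 containerd pulls the complete v1.30.11 control-plane image set: kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy, coredns v1.11.1, pause 3.9 and etcd 3.5.12-0. That matches a kubeadm image prefetch; assuming kubeadm is what is driving this bootstrap (the log does not show the command itself), the same list can be reproduced with:

    kubeadm config images list --kubernetes-version v1.30.11
    sudo kubeadm config images pull --kubernetes-version v1.30.11   # prefetch through the CRI socket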
Mar 20 18:04:15.878046 systemd[1]: kubelet.service: Consumed 221ms CPU time, 99M memory peak. Mar 20 18:04:15.880266 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:04:15.907353 systemd[1]: Reload requested from client PID 2276 ('systemctl') (unit session-7.scope)... Mar 20 18:04:15.907367 systemd[1]: Reloading... Mar 20 18:04:16.000346 zram_generator::config[2323]: No configuration found. Mar 20 18:04:16.135799 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 18:04:16.242452 systemd[1]: Reloading finished in 334 ms. Mar 20 18:04:16.303249 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:04:16.304973 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:04:16.308463 systemd[1]: kubelet.service: Deactivated successfully. Mar 20 18:04:16.308724 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:04:16.308758 systemd[1]: kubelet.service: Consumed 139ms CPU time, 83.6M memory peak. Mar 20 18:04:16.310315 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:04:16.483415 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:04:16.495677 (kubelet)[2369]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 20 18:04:16.536754 kubelet[2369]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 18:04:16.536754 kubelet[2369]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 18:04:16.536754 kubelet[2369]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
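The reload above comes from systemctl (a daemon-reload), after which the kubelet is restarted with several flags it reports as deprecated in favor of the config file, and with KUBELET_EXTRA_ARGS / KUBELET_KUBEADM_ARGS supplied through environment files referenced by the unit. A sketch of how to see where those come from on this host:

    # the unit file plus any drop-ins that reference KUBELET_KUBEADM_ARGS / KUBELET_EXTRA_ARGS
    systemctl cat kubelet.service
    # the exact command line the kubelet was started with, deprecated flags included
    systemctl show kubelet -p ExecStart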
Mar 20 18:04:16.537632 kubelet[2369]: I0320 18:04:16.537581 2369 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 18:04:16.699077 kubelet[2369]: I0320 18:04:16.699024 2369 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 20 18:04:16.699077 kubelet[2369]: I0320 18:04:16.699056 2369 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 18:04:16.699325 kubelet[2369]: I0320 18:04:16.699292 2369 server.go:927] "Client rotation is on, will bootstrap in background" Mar 20 18:04:16.713654 kubelet[2369]: I0320 18:04:16.713600 2369 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 20 18:04:16.714367 kubelet[2369]: E0320 18:04:16.714337 2369 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.124:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.727680 kubelet[2369]: I0320 18:04:16.727630 2369 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 20 18:04:16.728874 kubelet[2369]: I0320 18:04:16.728821 2369 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 18:04:16.729037 kubelet[2369]: I0320 18:04:16.728856 2369 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 20 18:04:16.729515 kubelet[2369]: I0320 18:04:16.729478 2369 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 18:04:16.729515 kubelet[2369]: I0320 18:04:16.729503 2369 container_manager_linux.go:301] "Creating device plugin manager" Mar 20 18:04:16.730399 kubelet[2369]: I0320 18:04:16.730362 2369 state_mem.go:36] "Initialized new in-memory state store" Mar 20 
18:04:16.731098 kubelet[2369]: I0320 18:04:16.731064 2369 kubelet.go:400] "Attempting to sync node with API server" Mar 20 18:04:16.731098 kubelet[2369]: I0320 18:04:16.731085 2369 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 18:04:16.731174 kubelet[2369]: I0320 18:04:16.731112 2369 kubelet.go:312] "Adding apiserver pod source" Mar 20 18:04:16.731174 kubelet[2369]: I0320 18:04:16.731135 2369 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 18:04:16.737198 kubelet[2369]: W0320 18:04:16.736182 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.124:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.737198 kubelet[2369]: E0320 18:04:16.736261 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.124:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.737198 kubelet[2369]: W0320 18:04:16.736876 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.124:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.737198 kubelet[2369]: E0320 18:04:16.736924 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.124:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.737198 kubelet[2369]: I0320 18:04:16.736996 2369 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 20 18:04:16.739066 kubelet[2369]: I0320 18:04:16.739039 2369 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 18:04:16.739127 kubelet[2369]: W0320 18:04:16.739106 2369 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 20 18:04:16.740027 kubelet[2369]: I0320 18:04:16.740004 2369 server.go:1264] "Started kubelet" Mar 20 18:04:16.741521 kubelet[2369]: I0320 18:04:16.740137 2369 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 18:04:16.745031 kubelet[2369]: I0320 18:04:16.744836 2369 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 18:04:16.745031 kubelet[2369]: I0320 18:04:16.744925 2369 server.go:455] "Adding debug handlers to kubelet server" Mar 20 18:04:16.745127 kubelet[2369]: I0320 18:04:16.745107 2369 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 18:04:16.745432 kubelet[2369]: I0320 18:04:16.745415 2369 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 18:04:16.747560 kubelet[2369]: E0320 18:04:16.747225 2369 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 20 18:04:16.747560 kubelet[2369]: E0320 18:04:16.747273 2369 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 20 18:04:16.747560 kubelet[2369]: I0320 18:04:16.747312 2369 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 20 18:04:16.747560 kubelet[2369]: I0320 18:04:16.747404 2369 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 20 18:04:16.747560 kubelet[2369]: I0320 18:04:16.747441 2369 reconciler.go:26] "Reconciler: start to sync state" Mar 20 18:04:16.748099 kubelet[2369]: W0320 18:04:16.748040 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.124:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.748099 kubelet[2369]: E0320 18:04:16.748084 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.124:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.748371 kubelet[2369]: E0320 18:04:16.748345 2369 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.124:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.124:6443: connect: connection refused" interval="200ms" Mar 20 18:04:16.748907 kubelet[2369]: I0320 18:04:16.748885 2369 factory.go:221] Registration of the systemd container factory successfully Mar 20 18:04:16.748984 kubelet[2369]: I0320 18:04:16.748964 2369 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 20 18:04:16.750166 kubelet[2369]: I0320 18:04:16.750146 2369 factory.go:221] Registration of the containerd container factory successfully Mar 20 18:04:16.752257 kubelet[2369]: E0320 18:04:16.750749 2369 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.124:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.124:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182e94ee3a498ce4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-20 18:04:16.73998666 +0000 UTC m=+0.240098363,LastTimestamp:2025-03-20 18:04:16.73998666 +0000 UTC m=+0.240098363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 20 18:04:16.761508 kubelet[2369]: I0320 18:04:16.761464 2369 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 18:04:16.763283 kubelet[2369]: I0320 18:04:16.763237 2369 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 18:04:16.763283 kubelet[2369]: I0320 18:04:16.763281 2369 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 18:04:16.763435 kubelet[2369]: I0320 18:04:16.763321 2369 kubelet.go:2337] "Starting kubelet main sync loop" Mar 20 18:04:16.763435 kubelet[2369]: E0320 18:04:16.763359 2369 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 18:04:16.767717 kubelet[2369]: W0320 18:04:16.767650 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.124:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.768139 kubelet[2369]: E0320 18:04:16.768115 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.124:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:16.768528 kubelet[2369]: I0320 18:04:16.768467 2369 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 20 18:04:16.768528 kubelet[2369]: I0320 18:04:16.768481 2369 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 20 18:04:16.768528 kubelet[2369]: I0320 18:04:16.768498 2369 state_mem.go:36] "Initialized new in-memory state store" Mar 20 18:04:16.848954 kubelet[2369]: I0320 18:04:16.848899 2369 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 20 18:04:16.849370 kubelet[2369]: E0320 18:04:16.849327 2369 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.124:6443/api/v1/nodes\": dial tcp 10.0.0.124:6443: connect: connection refused" node="localhost" Mar 20 18:04:16.864393 kubelet[2369]: E0320 18:04:16.864365 2369 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 18:04:16.949012 kubelet[2369]: E0320 18:04:16.948985 2369 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.124:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.124:6443: connect: connection refused" interval="400ms" Mar 20 18:04:16.994607 kubelet[2369]: I0320 18:04:16.994508 2369 policy_none.go:49] "None policy: Start" Mar 20 18:04:16.995572 kubelet[2369]: I0320 18:04:16.995543 2369 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 18:04:16.995572 kubelet[2369]: I0320 18:04:16.995580 2369 state_mem.go:35] "Initializing new in-memory state store" Mar 20 18:04:17.001337 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 20 18:04:17.021168 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 20 18:04:17.023991 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 20 18:04:17.039121 kubelet[2369]: I0320 18:04:17.039090 2369 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 18:04:17.039356 kubelet[2369]: I0320 18:04:17.039312 2369 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 18:04:17.039564 kubelet[2369]: I0320 18:04:17.039434 2369 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 18:04:17.040185 kubelet[2369]: E0320 18:04:17.040159 2369 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 20 18:04:17.050888 kubelet[2369]: I0320 18:04:17.050857 2369 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 20 18:04:17.051159 kubelet[2369]: E0320 18:04:17.051127 2369 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.124:6443/api/v1/nodes\": dial tcp 10.0.0.124:6443: connect: connection refused" node="localhost" Mar 20 18:04:17.065294 kubelet[2369]: I0320 18:04:17.065247 2369 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 20 18:04:17.066109 kubelet[2369]: I0320 18:04:17.066080 2369 topology_manager.go:215] "Topology Admit Handler" podUID="6f043f4f1fbe8eb46b6b63f672b28480" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 20 18:04:17.066804 kubelet[2369]: I0320 18:04:17.066770 2369 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 20 18:04:17.072784 systemd[1]: Created slice kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice - libcontainer container kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice. Mar 20 18:04:17.094705 systemd[1]: Created slice kubepods-burstable-pod6f043f4f1fbe8eb46b6b63f672b28480.slice - libcontainer container kubepods-burstable-pod6f043f4f1fbe8eb46b6b63f672b28480.slice. Mar 20 18:04:17.108900 systemd[1]: Created slice kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice - libcontainer container kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice. 
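The three Topology Admit Handler entries above are the static control-plane pods read from the manifest path registered earlier ("Adding static pod path" path="/etc/kubernetes/manifests"); while the API server at 10.0.0.124:6443 is still refusing connections, that file source is the only pod source able to make progress. A rough sketch, assuming plain .yaml manifests and not resembling the kubelet's actual file source, of spotting manifest changes by content hash:

```python
# Illustrative sketch of a file-based pod source: rescan a manifest directory
# and report added/removed/changed files by content hash. Not kubelet code.
import hashlib
from pathlib import Path

MANIFEST_DIR = Path("/etc/kubernetes/manifests")  # path taken from the log above

def snapshot(directory: Path) -> dict[str, str]:
    """Map each .yaml manifest to a SHA-256 of its contents (this sketch only scans .yaml)."""
    return {f.name: hashlib.sha256(f.read_bytes()).hexdigest()
            for f in sorted(directory.glob("*.yaml"))}

def diff(old: dict[str, str], new: dict[str, str]) -> dict[str, list[str]]:
    return {
        "added":   [n for n in new if n not in old],
        "removed": [n for n in old if n not in new],
        "changed": [n for n in new if n in old and old[n] != new[n]],
    }

if __name__ == "__main__":
    if MANIFEST_DIR.is_dir():
        print(diff({}, snapshot(MANIFEST_DIR)))
```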
Mar 20 18:04:17.150072 kubelet[2369]: I0320 18:04:17.150020 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:17.150072 kubelet[2369]: I0320 18:04:17.150069 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:17.150224 kubelet[2369]: I0320 18:04:17.150090 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:17.150224 kubelet[2369]: I0320 18:04:17.150112 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 20 18:04:17.150224 kubelet[2369]: I0320 18:04:17.150155 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6f043f4f1fbe8eb46b6b63f672b28480-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"6f043f4f1fbe8eb46b6b63f672b28480\") " pod="kube-system/kube-apiserver-localhost" Mar 20 18:04:17.150224 kubelet[2369]: I0320 18:04:17.150171 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:17.150224 kubelet[2369]: I0320 18:04:17.150187 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:17.150367 kubelet[2369]: I0320 18:04:17.150220 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6f043f4f1fbe8eb46b6b63f672b28480-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"6f043f4f1fbe8eb46b6b63f672b28480\") " pod="kube-system/kube-apiserver-localhost" Mar 20 18:04:17.150367 kubelet[2369]: I0320 18:04:17.150239 2369 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6f043f4f1fbe8eb46b6b63f672b28480-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"6f043f4f1fbe8eb46b6b63f672b28480\") " 
pod="kube-system/kube-apiserver-localhost" Mar 20 18:04:17.350463 kubelet[2369]: E0320 18:04:17.350427 2369 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.124:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.124:6443: connect: connection refused" interval="800ms" Mar 20 18:04:17.394196 containerd[1505]: time="2025-03-20T18:04:17.394159405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,}" Mar 20 18:04:17.407756 containerd[1505]: time="2025-03-20T18:04:17.407726097Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:6f043f4f1fbe8eb46b6b63f672b28480,Namespace:kube-system,Attempt:0,}" Mar 20 18:04:17.414221 containerd[1505]: time="2025-03-20T18:04:17.414188692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,}" Mar 20 18:04:17.452591 kubelet[2369]: I0320 18:04:17.452545 2369 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 20 18:04:17.452822 kubelet[2369]: E0320 18:04:17.452797 2369 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.124:6443/api/v1/nodes\": dial tcp 10.0.0.124:6443: connect: connection refused" node="localhost" Mar 20 18:04:17.787621 kubelet[2369]: W0320 18:04:17.787493 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.124:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:17.787621 kubelet[2369]: E0320 18:04:17.787543 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.124:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:17.792885 kubelet[2369]: W0320 18:04:17.792847 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.124:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:17.792885 kubelet[2369]: E0320 18:04:17.792882 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.124:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:17.851497 kubelet[2369]: W0320 18:04:17.851438 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.124:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:17.851497 kubelet[2369]: E0320 18:04:17.851496 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.124:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:18.031891 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3490833498.mount: Deactivated successfully. 
Mar 20 18:04:18.038076 containerd[1505]: time="2025-03-20T18:04:18.038031826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 20 18:04:18.042384 containerd[1505]: time="2025-03-20T18:04:18.042315202Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Mar 20 18:04:18.043210 containerd[1505]: time="2025-03-20T18:04:18.043173081Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 20 18:04:18.044948 containerd[1505]: time="2025-03-20T18:04:18.044916352Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 20 18:04:18.045615 containerd[1505]: time="2025-03-20T18:04:18.045545642Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 20 18:04:18.046565 containerd[1505]: time="2025-03-20T18:04:18.046530079Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 20 18:04:18.047392 containerd[1505]: time="2025-03-20T18:04:18.047354085Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 20 18:04:18.048460 containerd[1505]: time="2025-03-20T18:04:18.048425114Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 20 18:04:18.048974 containerd[1505]: time="2025-03-20T18:04:18.048945020Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 651.954823ms" Mar 20 18:04:18.050056 containerd[1505]: time="2025-03-20T18:04:18.050028743Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 640.25808ms" Mar 20 18:04:18.054078 containerd[1505]: time="2025-03-20T18:04:18.054046911Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 635.849088ms" Mar 20 18:04:18.075642 containerd[1505]: time="2025-03-20T18:04:18.075601329Z" level=info msg="connecting to shim 470503960d228ab6a28d0560fc577b0c90061954deba16adf45d9a2730617540" address="unix:///run/containerd/s/9d5b967e8f5d81725dc82f40f9aa0cda4dafd7ac09c85bf324815c0e4436d1d6" namespace=k8s.io protocol=ttrpc version=3 Mar 20 
18:04:18.076742 containerd[1505]: time="2025-03-20T18:04:18.076542024Z" level=info msg="connecting to shim e84d962106a7ae6f64c05cdebf49306220d75b27db00cc4d6930b6bf93c784f1" address="unix:///run/containerd/s/cf38bb78d0c0b6fdb74928a160dba40b02866f80832befb4a8b9e0853ec86fe9" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:04:18.086087 containerd[1505]: time="2025-03-20T18:04:18.086043299Z" level=info msg="connecting to shim dc0fb6b06c47905910ce2e5afd2570e3965f5756a9140109aa1fb82b4a26aab2" address="unix:///run/containerd/s/b4f20d54066430666f0d139ddc31be980a3de93a0e01e7c2b8862dff44db9b17" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:04:18.099434 systemd[1]: Started cri-containerd-e84d962106a7ae6f64c05cdebf49306220d75b27db00cc4d6930b6bf93c784f1.scope - libcontainer container e84d962106a7ae6f64c05cdebf49306220d75b27db00cc4d6930b6bf93c784f1. Mar 20 18:04:18.103154 systemd[1]: Started cri-containerd-470503960d228ab6a28d0560fc577b0c90061954deba16adf45d9a2730617540.scope - libcontainer container 470503960d228ab6a28d0560fc577b0c90061954deba16adf45d9a2730617540. Mar 20 18:04:18.108254 systemd[1]: Started cri-containerd-dc0fb6b06c47905910ce2e5afd2570e3965f5756a9140109aa1fb82b4a26aab2.scope - libcontainer container dc0fb6b06c47905910ce2e5afd2570e3965f5756a9140109aa1fb82b4a26aab2. Mar 20 18:04:18.142037 containerd[1505]: time="2025-03-20T18:04:18.141991552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:6f043f4f1fbe8eb46b6b63f672b28480,Namespace:kube-system,Attempt:0,} returns sandbox id \"e84d962106a7ae6f64c05cdebf49306220d75b27db00cc4d6930b6bf93c784f1\"" Mar 20 18:04:18.146199 containerd[1505]: time="2025-03-20T18:04:18.146120969Z" level=info msg="CreateContainer within sandbox \"e84d962106a7ae6f64c05cdebf49306220d75b27db00cc4d6930b6bf93c784f1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 20 18:04:18.149046 containerd[1505]: time="2025-03-20T18:04:18.149016711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"470503960d228ab6a28d0560fc577b0c90061954deba16adf45d9a2730617540\"" Mar 20 18:04:18.150860 kubelet[2369]: E0320 18:04:18.150831 2369 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.124:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.124:6443: connect: connection refused" interval="1.6s" Mar 20 18:04:18.151695 containerd[1505]: time="2025-03-20T18:04:18.151668536Z" level=info msg="CreateContainer within sandbox \"470503960d228ab6a28d0560fc577b0c90061954deba16adf45d9a2730617540\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 20 18:04:18.157380 containerd[1505]: time="2025-03-20T18:04:18.157238706Z" level=info msg="Container ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:18.158584 containerd[1505]: time="2025-03-20T18:04:18.158564624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,} returns sandbox id \"dc0fb6b06c47905910ce2e5afd2570e3965f5756a9140109aa1fb82b4a26aab2\"" Mar 20 18:04:18.160771 containerd[1505]: time="2025-03-20T18:04:18.160723404Z" level=info msg="CreateContainer within sandbox \"dc0fb6b06c47905910ce2e5afd2570e3965f5756a9140109aa1fb82b4a26aab2\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 20 18:04:18.162395 containerd[1505]: time="2025-03-20T18:04:18.162352080Z" level=info msg="Container 58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:18.166140 containerd[1505]: time="2025-03-20T18:04:18.166109268Z" level=info msg="CreateContainer within sandbox \"e84d962106a7ae6f64c05cdebf49306220d75b27db00cc4d6930b6bf93c784f1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d\"" Mar 20 18:04:18.166496 containerd[1505]: time="2025-03-20T18:04:18.166476417Z" level=info msg="StartContainer for \"ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d\"" Mar 20 18:04:18.167452 containerd[1505]: time="2025-03-20T18:04:18.167409568Z" level=info msg="connecting to shim ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d" address="unix:///run/containerd/s/cf38bb78d0c0b6fdb74928a160dba40b02866f80832befb4a8b9e0853ec86fe9" protocol=ttrpc version=3 Mar 20 18:04:18.171137 containerd[1505]: time="2025-03-20T18:04:18.171113326Z" level=info msg="CreateContainer within sandbox \"470503960d228ab6a28d0560fc577b0c90061954deba16adf45d9a2730617540\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0\"" Mar 20 18:04:18.171536 containerd[1505]: time="2025-03-20T18:04:18.171511243Z" level=info msg="StartContainer for \"58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0\"" Mar 20 18:04:18.172598 containerd[1505]: time="2025-03-20T18:04:18.172498806Z" level=info msg="connecting to shim 58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0" address="unix:///run/containerd/s/9d5b967e8f5d81725dc82f40f9aa0cda4dafd7ac09c85bf324815c0e4436d1d6" protocol=ttrpc version=3 Mar 20 18:04:18.177336 containerd[1505]: time="2025-03-20T18:04:18.177050705Z" level=info msg="Container a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:18.185352 containerd[1505]: time="2025-03-20T18:04:18.185317094Z" level=info msg="CreateContainer within sandbox \"dc0fb6b06c47905910ce2e5afd2570e3965f5756a9140109aa1fb82b4a26aab2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43\"" Mar 20 18:04:18.185749 containerd[1505]: time="2025-03-20T18:04:18.185709600Z" level=info msg="StartContainer for \"a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43\"" Mar 20 18:04:18.186732 containerd[1505]: time="2025-03-20T18:04:18.186702453Z" level=info msg="connecting to shim a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43" address="unix:///run/containerd/s/b4f20d54066430666f0d139ddc31be980a3de93a0e01e7c2b8862dff44db9b17" protocol=ttrpc version=3 Mar 20 18:04:18.189448 systemd[1]: Started cri-containerd-ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d.scope - libcontainer container ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d. Mar 20 18:04:18.192760 systemd[1]: Started cri-containerd-58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0.scope - libcontainer container 58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0. 
Mar 20 18:04:18.201221 systemd[1]: Started cri-containerd-a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43.scope - libcontainer container a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43. Mar 20 18:04:18.225654 kubelet[2369]: W0320 18:04:18.225581 2369 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.124:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:18.225654 kubelet[2369]: E0320 18:04:18.225659 2369 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.124:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.124:6443: connect: connection refused Mar 20 18:04:18.242707 containerd[1505]: time="2025-03-20T18:04:18.242647630Z" level=info msg="StartContainer for \"ca564740ed0219a01e73929343e9de6b337fbce02f995d8f983d99363a4c5a7d\" returns successfully" Mar 20 18:04:18.248819 containerd[1505]: time="2025-03-20T18:04:18.248748525Z" level=info msg="StartContainer for \"58fbe780e5cc594d02eb2b5fefb866f7629b0aef15c925698aba4ea76976a0b0\" returns successfully" Mar 20 18:04:18.254459 containerd[1505]: time="2025-03-20T18:04:18.254417190Z" level=info msg="StartContainer for \"a9f1662957e6810cd28cc2fab7e426c74ed31f8d664e308fee9fe27f538f6d43\" returns successfully" Mar 20 18:04:18.256323 kubelet[2369]: I0320 18:04:18.254723 2369 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 20 18:04:18.256323 kubelet[2369]: E0320 18:04:18.255002 2369 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.124:6443/api/v1/nodes\": dial tcp 10.0.0.124:6443: connect: connection refused" node="localhost" Mar 20 18:04:19.275277 kubelet[2369]: E0320 18:04:19.275170 2369 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{localhost.182e94ee3a498ce4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-20 18:04:16.73998666 +0000 UTC m=+0.240098363,LastTimestamp:2025-03-20 18:04:16.73998666 +0000 UTC m=+0.240098363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 20 18:04:19.505409 kubelet[2369]: E0320 18:04:19.505354 2369 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Mar 20 18:04:19.737653 kubelet[2369]: I0320 18:04:19.737615 2369 apiserver.go:52] "Watching apiserver" Mar 20 18:04:19.747681 kubelet[2369]: I0320 18:04:19.747656 2369 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 20 18:04:19.754929 kubelet[2369]: E0320 18:04:19.754905 2369 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 20 18:04:19.857417 kubelet[2369]: I0320 18:04:19.857379 2369 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 20 18:04:19.859210 kubelet[2369]: E0320 18:04:19.859181 2369 
csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Mar 20 18:04:19.861152 kubelet[2369]: I0320 18:04:19.861086 2369 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 20 18:04:21.418151 systemd[1]: Reload requested from client PID 2642 ('systemctl') (unit session-7.scope)... Mar 20 18:04:21.418168 systemd[1]: Reloading... Mar 20 18:04:21.505343 zram_generator::config[2689]: No configuration found. Mar 20 18:04:21.856806 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 18:04:21.971786 systemd[1]: Reloading finished in 553 ms. Mar 20 18:04:21.996694 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:04:22.008694 systemd[1]: kubelet.service: Deactivated successfully. Mar 20 18:04:22.008995 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:04:22.009045 systemd[1]: kubelet.service: Consumed 706ms CPU time, 120.4M memory peak. Mar 20 18:04:22.010840 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 18:04:22.205406 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 18:04:22.215042 (kubelet)[2731]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 20 18:04:22.256717 kubelet[2731]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 18:04:22.256717 kubelet[2731]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 18:04:22.256717 kubelet[2731]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 18:04:22.257090 kubelet[2731]: I0320 18:04:22.256749 2731 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 18:04:22.261041 kubelet[2731]: I0320 18:04:22.261015 2731 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 20 18:04:22.261041 kubelet[2731]: I0320 18:04:22.261033 2731 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 18:04:22.261184 kubelet[2731]: I0320 18:04:22.261170 2731 server.go:927] "Client rotation is on, will bootstrap in background" Mar 20 18:04:22.262353 kubelet[2731]: I0320 18:04:22.262331 2731 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 18:04:22.263353 kubelet[2731]: I0320 18:04:22.263246 2731 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 20 18:04:22.272165 kubelet[2731]: I0320 18:04:22.272144 2731 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 20 18:04:22.272410 kubelet[2731]: I0320 18:04:22.272380 2731 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 18:04:22.272552 kubelet[2731]: I0320 18:04:22.272408 2731 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 20 18:04:22.272628 kubelet[2731]: I0320 18:04:22.272574 2731 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 18:04:22.272628 kubelet[2731]: I0320 18:04:22.272584 2731 container_manager_linux.go:301] "Creating device plugin manager" Mar 20 18:04:22.272681 kubelet[2731]: I0320 18:04:22.272637 2731 state_mem.go:36] "Initialized new in-memory state store" Mar 20 18:04:22.272736 kubelet[2731]: I0320 18:04:22.272722 2731 kubelet.go:400] "Attempting to sync node with API server" Mar 20 18:04:22.272736 kubelet[2731]: I0320 18:04:22.272735 2731 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 18:04:22.272798 kubelet[2731]: I0320 18:04:22.272754 2731 kubelet.go:312] "Adding apiserver pod source" Mar 20 18:04:22.272798 kubelet[2731]: I0320 18:04:22.272774 2731 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 18:04:22.273725 kubelet[2731]: I0320 18:04:22.273667 2731 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 20 18:04:22.273888 kubelet[2731]: I0320 18:04:22.273849 2731 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 18:04:22.274917 kubelet[2731]: I0320 18:04:22.274286 2731 server.go:1264] "Started kubelet" Mar 20 18:04:22.274917 kubelet[2731]: I0320 18:04:22.274587 2731 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 18:04:22.274917 kubelet[2731]: I0320 18:04:22.274578 2731 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 18:04:22.274917 kubelet[2731]: I0320 18:04:22.274839 2731 
server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 18:04:22.275540 kubelet[2731]: I0320 18:04:22.275518 2731 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 18:04:22.275807 kubelet[2731]: I0320 18:04:22.275791 2731 server.go:455] "Adding debug handlers to kubelet server" Mar 20 18:04:22.280139 kubelet[2731]: E0320 18:04:22.280122 2731 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 20 18:04:22.282427 kubelet[2731]: E0320 18:04:22.282399 2731 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 20 18:04:22.282465 kubelet[2731]: I0320 18:04:22.282450 2731 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 20 18:04:22.282577 kubelet[2731]: I0320 18:04:22.282553 2731 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 20 18:04:22.282719 kubelet[2731]: I0320 18:04:22.282703 2731 reconciler.go:26] "Reconciler: start to sync state" Mar 20 18:04:22.283555 kubelet[2731]: I0320 18:04:22.283533 2731 factory.go:221] Registration of the systemd container factory successfully Mar 20 18:04:22.283641 kubelet[2731]: I0320 18:04:22.283618 2731 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 20 18:04:22.285335 kubelet[2731]: I0320 18:04:22.284840 2731 factory.go:221] Registration of the containerd container factory successfully Mar 20 18:04:22.291828 kubelet[2731]: I0320 18:04:22.291190 2731 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 18:04:22.292770 kubelet[2731]: I0320 18:04:22.292733 2731 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 18:04:22.292827 kubelet[2731]: I0320 18:04:22.292774 2731 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 18:04:22.292827 kubelet[2731]: I0320 18:04:22.292791 2731 kubelet.go:2337] "Starting kubelet main sync loop" Mar 20 18:04:22.292887 kubelet[2731]: E0320 18:04:22.292846 2731 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 18:04:22.315710 kubelet[2731]: I0320 18:04:22.315676 2731 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 20 18:04:22.315710 kubelet[2731]: I0320 18:04:22.315697 2731 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 20 18:04:22.315891 kubelet[2731]: I0320 18:04:22.315716 2731 state_mem.go:36] "Initialized new in-memory state store" Mar 20 18:04:22.315915 kubelet[2731]: I0320 18:04:22.315899 2731 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 20 18:04:22.315941 kubelet[2731]: I0320 18:04:22.315910 2731 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 20 18:04:22.315941 kubelet[2731]: I0320 18:04:22.315931 2731 policy_none.go:49] "None policy: Start" Mar 20 18:04:22.316418 kubelet[2731]: I0320 18:04:22.316377 2731 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 18:04:22.316418 kubelet[2731]: I0320 18:04:22.316401 2731 state_mem.go:35] "Initializing new in-memory state store" Mar 20 18:04:22.316537 kubelet[2731]: I0320 18:04:22.316520 2731 state_mem.go:75] "Updated machine memory state" Mar 20 18:04:22.320804 kubelet[2731]: I0320 18:04:22.320770 2731 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 18:04:22.321016 kubelet[2731]: I0320 18:04:22.320979 2731 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 18:04:22.321191 kubelet[2731]: I0320 18:04:22.321113 2731 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 18:04:22.383903 kubelet[2731]: I0320 18:04:22.383859 2731 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 20 18:04:22.393128 kubelet[2731]: I0320 18:04:22.393086 2731 topology_manager.go:215] "Topology Admit Handler" podUID="6f043f4f1fbe8eb46b6b63f672b28480" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 20 18:04:22.393753 kubelet[2731]: I0320 18:04:22.393165 2731 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 20 18:04:22.393753 kubelet[2731]: I0320 18:04:22.393223 2731 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 20 18:04:22.397769 kubelet[2731]: I0320 18:04:22.397721 2731 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Mar 20 18:04:22.397850 kubelet[2731]: I0320 18:04:22.397817 2731 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 20 18:04:22.401605 kubelet[2731]: E0320 18:04:22.401521 2731 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:22.402116 kubelet[2731]: E0320 18:04:22.402086 2731 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" 
pod="kube-system/kube-scheduler-localhost" Mar 20 18:04:22.583800 kubelet[2731]: I0320 18:04:22.583733 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 20 18:04:22.583800 kubelet[2731]: I0320 18:04:22.583793 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6f043f4f1fbe8eb46b6b63f672b28480-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"6f043f4f1fbe8eb46b6b63f672b28480\") " pod="kube-system/kube-apiserver-localhost" Mar 20 18:04:22.583985 kubelet[2731]: I0320 18:04:22.583810 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6f043f4f1fbe8eb46b6b63f672b28480-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"6f043f4f1fbe8eb46b6b63f672b28480\") " pod="kube-system/kube-apiserver-localhost" Mar 20 18:04:22.583985 kubelet[2731]: I0320 18:04:22.583825 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:22.583985 kubelet[2731]: I0320 18:04:22.583838 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:22.583985 kubelet[2731]: I0320 18:04:22.583854 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:22.583985 kubelet[2731]: I0320 18:04:22.583868 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:22.584117 kubelet[2731]: I0320 18:04:22.583922 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 20 18:04:22.584117 kubelet[2731]: I0320 18:04:22.583979 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6f043f4f1fbe8eb46b6b63f672b28480-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: 
\"6f043f4f1fbe8eb46b6b63f672b28480\") " pod="kube-system/kube-apiserver-localhost" Mar 20 18:04:23.274062 kubelet[2731]: I0320 18:04:23.274022 2731 apiserver.go:52] "Watching apiserver" Mar 20 18:04:23.283296 kubelet[2731]: I0320 18:04:23.283259 2731 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 20 18:04:23.447237 kubelet[2731]: I0320 18:04:23.446853 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.446838533 podStartE2EDuration="2.446838533s" podCreationTimestamp="2025-03-20 18:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 18:04:23.437821326 +0000 UTC m=+1.218986777" watchObservedRunningTime="2025-03-20 18:04:23.446838533 +0000 UTC m=+1.228003984" Mar 20 18:04:23.447237 kubelet[2731]: I0320 18:04:23.446957 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.446954351 podStartE2EDuration="3.446954351s" podCreationTimestamp="2025-03-20 18:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 18:04:23.446790583 +0000 UTC m=+1.227956034" watchObservedRunningTime="2025-03-20 18:04:23.446954351 +0000 UTC m=+1.228119802" Mar 20 18:04:23.458812 kubelet[2731]: I0320 18:04:23.458603 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.4585889779999999 podStartE2EDuration="1.458588978s" podCreationTimestamp="2025-03-20 18:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 18:04:23.452523499 +0000 UTC m=+1.233688950" watchObservedRunningTime="2025-03-20 18:04:23.458588978 +0000 UTC m=+1.239754429" Mar 20 18:04:26.510663 sudo[1707]: pam_unix(sudo:session): session closed for user root Mar 20 18:04:26.511977 sshd[1706]: Connection closed by 10.0.0.1 port 36226 Mar 20 18:04:26.512425 sshd-session[1703]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:26.516986 systemd[1]: sshd@6-10.0.0.124:22-10.0.0.1:36226.service: Deactivated successfully. Mar 20 18:04:26.519235 systemd[1]: session-7.scope: Deactivated successfully. Mar 20 18:04:26.519521 systemd[1]: session-7.scope: Consumed 4.002s CPU time, 232.6M memory peak. Mar 20 18:04:26.520794 systemd-logind[1485]: Session 7 logged out. Waiting for processes to exit. Mar 20 18:04:26.521915 systemd-logind[1485]: Removed session 7. Mar 20 18:04:35.589517 kubelet[2731]: I0320 18:04:35.589466 2731 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 20 18:04:35.590403 containerd[1505]: time="2025-03-20T18:04:35.590362955Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 20 18:04:35.591468 kubelet[2731]: I0320 18:04:35.591295 2731 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 20 18:04:36.307017 kubelet[2731]: I0320 18:04:36.306943 2731 topology_manager.go:215] "Topology Admit Handler" podUID="66dcbf2b-4cd7-4375-8271-6ee7073f1997" podNamespace="kube-system" podName="kube-proxy-vlzk6" Mar 20 18:04:36.316772 systemd[1]: Created slice kubepods-besteffort-pod66dcbf2b_4cd7_4375_8271_6ee7073f1997.slice - libcontainer container kubepods-besteffort-pod66dcbf2b_4cd7_4375_8271_6ee7073f1997.slice. Mar 20 18:04:36.467972 kubelet[2731]: I0320 18:04:36.467913 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/66dcbf2b-4cd7-4375-8271-6ee7073f1997-kube-proxy\") pod \"kube-proxy-vlzk6\" (UID: \"66dcbf2b-4cd7-4375-8271-6ee7073f1997\") " pod="kube-system/kube-proxy-vlzk6" Mar 20 18:04:36.467972 kubelet[2731]: I0320 18:04:36.467957 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckghb\" (UniqueName: \"kubernetes.io/projected/66dcbf2b-4cd7-4375-8271-6ee7073f1997-kube-api-access-ckghb\") pod \"kube-proxy-vlzk6\" (UID: \"66dcbf2b-4cd7-4375-8271-6ee7073f1997\") " pod="kube-system/kube-proxy-vlzk6" Mar 20 18:04:36.467972 kubelet[2731]: I0320 18:04:36.467979 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/66dcbf2b-4cd7-4375-8271-6ee7073f1997-xtables-lock\") pod \"kube-proxy-vlzk6\" (UID: \"66dcbf2b-4cd7-4375-8271-6ee7073f1997\") " pod="kube-system/kube-proxy-vlzk6" Mar 20 18:04:36.468173 kubelet[2731]: I0320 18:04:36.468015 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66dcbf2b-4cd7-4375-8271-6ee7073f1997-lib-modules\") pod \"kube-proxy-vlzk6\" (UID: \"66dcbf2b-4cd7-4375-8271-6ee7073f1997\") " pod="kube-system/kube-proxy-vlzk6" Mar 20 18:04:36.629962 kubelet[2731]: I0320 18:04:36.629248 2731 topology_manager.go:215] "Topology Admit Handler" podUID="4666ef54-1f85-4469-925a-4fdd3b3ccb87" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-7v56m" Mar 20 18:04:36.635337 containerd[1505]: time="2025-03-20T18:04:36.635264100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vlzk6,Uid:66dcbf2b-4cd7-4375-8271-6ee7073f1997,Namespace:kube-system,Attempt:0,}" Mar 20 18:04:36.640796 systemd[1]: Created slice kubepods-besteffort-pod4666ef54_1f85_4469_925a_4fdd3b3ccb87.slice - libcontainer container kubepods-besteffort-pod4666ef54_1f85_4469_925a_4fdd3b3ccb87.slice. Mar 20 18:04:36.683732 containerd[1505]: time="2025-03-20T18:04:36.683677702Z" level=info msg="connecting to shim 15040715bbc060109b39332e3ae607e6fa6a8eef22dbf05dc56d48b84aebd6e0" address="unix:///run/containerd/s/6bea1e37b8e02ef69e3d34b8af67a942a7de936e65cf83aa1c06534e0e886d62" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:04:36.741522 systemd[1]: Started cri-containerd-15040715bbc060109b39332e3ae607e6fa6a8eef22dbf05dc56d48b84aebd6e0.scope - libcontainer container 15040715bbc060109b39332e3ae607e6fa6a8eef22dbf05dc56d48b84aebd6e0. 
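The kube-proxy pod admitted above (UID 66dcbf2b-4cd7-4375-8271-6ee7073f1997) is placed in kubepods-besteffort-pod66dcbf2b_4cd7_4375_8271_6ee7073f1997.slice, and the earlier control-plane pods were placed in kubepods-burstable-pod<uid>.slice units: with the systemd cgroup driver the slice name combines the QoS class with the pod UID, dashes replaced by underscores. A small sketch of that naming, covering only the burstable and besteffort cases visible in this log:

```python
def kubepods_slice(qos_class: str, pod_uid: str) -> str:
    """Slice name as seen in this log for burstable/besteffort pods (systemd cgroup driver)."""
    return f"kubepods-{qos_class}-pod{pod_uid.replace('-', '_')}.slice"

print(kubepods_slice("besteffort", "66dcbf2b-4cd7-4375-8271-6ee7073f1997"))
# kubepods-besteffort-pod66dcbf2b_4cd7_4375_8271_6ee7073f1997.slice  (matches the unit above)
```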
Mar 20 18:04:36.767560 containerd[1505]: time="2025-03-20T18:04:36.767500261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vlzk6,Uid:66dcbf2b-4cd7-4375-8271-6ee7073f1997,Namespace:kube-system,Attempt:0,} returns sandbox id \"15040715bbc060109b39332e3ae607e6fa6a8eef22dbf05dc56d48b84aebd6e0\"" Mar 20 18:04:36.769883 kubelet[2731]: I0320 18:04:36.769829 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4666ef54-1f85-4469-925a-4fdd3b3ccb87-var-lib-calico\") pod \"tigera-operator-6479d6dc54-7v56m\" (UID: \"4666ef54-1f85-4469-925a-4fdd3b3ccb87\") " pod="tigera-operator/tigera-operator-6479d6dc54-7v56m" Mar 20 18:04:36.769883 kubelet[2731]: I0320 18:04:36.769880 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llxf5\" (UniqueName: \"kubernetes.io/projected/4666ef54-1f85-4469-925a-4fdd3b3ccb87-kube-api-access-llxf5\") pod \"tigera-operator-6479d6dc54-7v56m\" (UID: \"4666ef54-1f85-4469-925a-4fdd3b3ccb87\") " pod="tigera-operator/tigera-operator-6479d6dc54-7v56m" Mar 20 18:04:36.770864 containerd[1505]: time="2025-03-20T18:04:36.770721671Z" level=info msg="CreateContainer within sandbox \"15040715bbc060109b39332e3ae607e6fa6a8eef22dbf05dc56d48b84aebd6e0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 20 18:04:36.783217 containerd[1505]: time="2025-03-20T18:04:36.783153628Z" level=info msg="Container fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:36.787506 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3642541336.mount: Deactivated successfully. Mar 20 18:04:36.794068 containerd[1505]: time="2025-03-20T18:04:36.794019721Z" level=info msg="CreateContainer within sandbox \"15040715bbc060109b39332e3ae607e6fa6a8eef22dbf05dc56d48b84aebd6e0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee\"" Mar 20 18:04:36.794709 containerd[1505]: time="2025-03-20T18:04:36.794651982Z" level=info msg="StartContainer for \"fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee\"" Mar 20 18:04:36.796115 containerd[1505]: time="2025-03-20T18:04:36.796090615Z" level=info msg="connecting to shim fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee" address="unix:///run/containerd/s/6bea1e37b8e02ef69e3d34b8af67a942a7de936e65cf83aa1c06534e0e886d62" protocol=ttrpc version=3 Mar 20 18:04:36.818528 systemd[1]: Started cri-containerd-fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee.scope - libcontainer container fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee. 
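The kubelet_network "Updating Pod CIDR" entry earlier in this block reports the node's pod CIDR changing from empty to 192.168.0.0/24. Purely as an illustration of what that range provides, using the standard library:

```python
import ipaddress

# Pod CIDR reported by the kubelet_network "Updating Pod CIDR" entry above.
pod_cidr = ipaddress.ip_network("192.168.0.0/24")
print(pod_cidr.num_addresses, pod_cidr[1], pod_cidr[-2])  # 256 192.168.0.1 192.168.0.254
```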
Mar 20 18:04:36.862974 containerd[1505]: time="2025-03-20T18:04:36.862240711Z" level=info msg="StartContainer for \"fb78508fd0e63b6519fe51702888ee1530e1022ae0c6f4571ab2bbe688e201ee\" returns successfully" Mar 20 18:04:36.944574 containerd[1505]: time="2025-03-20T18:04:36.944447791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-7v56m,Uid:4666ef54-1f85-4469-925a-4fdd3b3ccb87,Namespace:tigera-operator,Attempt:0,}" Mar 20 18:04:36.965802 containerd[1505]: time="2025-03-20T18:04:36.965737837Z" level=info msg="connecting to shim 216946c878ccd6ecb82d1cd90688103aeebd317c6ca6e8011c2d285dcaf71d1f" address="unix:///run/containerd/s/2d50a6ae298420aca58f242b02967f78ded5ab21a854edcfebf78fdbea204f60" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:04:37.000544 systemd[1]: Started cri-containerd-216946c878ccd6ecb82d1cd90688103aeebd317c6ca6e8011c2d285dcaf71d1f.scope - libcontainer container 216946c878ccd6ecb82d1cd90688103aeebd317c6ca6e8011c2d285dcaf71d1f. Mar 20 18:04:37.045399 containerd[1505]: time="2025-03-20T18:04:37.045355282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-7v56m,Uid:4666ef54-1f85-4469-925a-4fdd3b3ccb87,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"216946c878ccd6ecb82d1cd90688103aeebd317c6ca6e8011c2d285dcaf71d1f\"" Mar 20 18:04:37.047055 containerd[1505]: time="2025-03-20T18:04:37.047028188Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 20 18:04:37.336864 kubelet[2731]: I0320 18:04:37.336795 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vlzk6" podStartSLOduration=1.336776349 podStartE2EDuration="1.336776349s" podCreationTimestamp="2025-03-20 18:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 18:04:37.336727045 +0000 UTC m=+15.117892496" watchObservedRunningTime="2025-03-20 18:04:37.336776349 +0000 UTC m=+15.117941800" Mar 20 18:04:38.364714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3791402927.mount: Deactivated successfully. 
Mar 20 18:04:38.660441 containerd[1505]: time="2025-03-20T18:04:38.660321634Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:38.661165 containerd[1505]: time="2025-03-20T18:04:38.661116731Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 20 18:04:38.662259 containerd[1505]: time="2025-03-20T18:04:38.662227820Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:38.664013 containerd[1505]: time="2025-03-20T18:04:38.663982167Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:38.664629 containerd[1505]: time="2025-03-20T18:04:38.664588668Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 1.617524953s" Mar 20 18:04:38.664681 containerd[1505]: time="2025-03-20T18:04:38.664627021Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 20 18:04:38.666099 containerd[1505]: time="2025-03-20T18:04:38.666068595Z" level=info msg="CreateContainer within sandbox \"216946c878ccd6ecb82d1cd90688103aeebd317c6ca6e8011c2d285dcaf71d1f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 20 18:04:38.674114 containerd[1505]: time="2025-03-20T18:04:38.674075522Z" level=info msg="Container 8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:38.680312 containerd[1505]: time="2025-03-20T18:04:38.680268880Z" level=info msg="CreateContainer within sandbox \"216946c878ccd6ecb82d1cd90688103aeebd317c6ca6e8011c2d285dcaf71d1f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d\"" Mar 20 18:04:38.680753 containerd[1505]: time="2025-03-20T18:04:38.680713614Z" level=info msg="StartContainer for \"8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d\"" Mar 20 18:04:38.681492 containerd[1505]: time="2025-03-20T18:04:38.681465250Z" level=info msg="connecting to shim 8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d" address="unix:///run/containerd/s/2d50a6ae298420aca58f242b02967f78ded5ab21a854edcfebf78fdbea204f60" protocol=ttrpc version=3 Mar 20 18:04:38.705481 systemd[1]: Started cri-containerd-8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d.scope - libcontainer container 8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d. 
Mar 20 18:04:38.734640 containerd[1505]: time="2025-03-20T18:04:38.734603877Z" level=info msg="StartContainer for \"8717141f23b63a3f3844522465cc870a3423a39ff66c2c67af57bae437b9714d\" returns successfully" Mar 20 18:04:39.340090 kubelet[2731]: I0320 18:04:39.340018 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-7v56m" podStartSLOduration=1.721439008 podStartE2EDuration="3.340000256s" podCreationTimestamp="2025-03-20 18:04:36 +0000 UTC" firstStartedPulling="2025-03-20 18:04:37.046531896 +0000 UTC m=+14.827697347" lastFinishedPulling="2025-03-20 18:04:38.665093144 +0000 UTC m=+16.446258595" observedRunningTime="2025-03-20 18:04:39.339661874 +0000 UTC m=+17.120827325" watchObservedRunningTime="2025-03-20 18:04:39.340000256 +0000 UTC m=+17.121165707" Mar 20 18:04:39.927631 update_engine[1486]: I20250320 18:04:39.927551 1486 update_attempter.cc:509] Updating boot flags... Mar 20 18:04:39.953337 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (3024) Mar 20 18:04:39.986565 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (3121) Mar 20 18:04:40.031410 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (3121) Mar 20 18:04:41.861006 kubelet[2731]: I0320 18:04:41.860963 2731 topology_manager.go:215] "Topology Admit Handler" podUID="55f33bc9-a33c-4b47-93d7-14a0b8511279" podNamespace="calico-system" podName="calico-typha-64c5cd9bb6-n7pnb" Mar 20 18:04:41.869998 systemd[1]: Created slice kubepods-besteffort-pod55f33bc9_a33c_4b47_93d7_14a0b8511279.slice - libcontainer container kubepods-besteffort-pod55f33bc9_a33c_4b47_93d7_14a0b8511279.slice. Mar 20 18:04:41.900720 kubelet[2731]: I0320 18:04:41.900232 2731 topology_manager.go:215] "Topology Admit Handler" podUID="9586c919-8ef8-469e-a15f-e8b454080a40" podNamespace="calico-system" podName="calico-node-hkdwt" Mar 20 18:04:41.909834 systemd[1]: Created slice kubepods-besteffort-pod9586c919_8ef8_469e_a15f_e8b454080a40.slice - libcontainer container kubepods-besteffort-pod9586c919_8ef8_469e_a15f_e8b454080a40.slice. 
Mar 20 18:04:42.002749 kubelet[2731]: I0320 18:04:42.002695 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55f33bc9-a33c-4b47-93d7-14a0b8511279-tigera-ca-bundle\") pod \"calico-typha-64c5cd9bb6-n7pnb\" (UID: \"55f33bc9-a33c-4b47-93d7-14a0b8511279\") " pod="calico-system/calico-typha-64c5cd9bb6-n7pnb" Mar 20 18:04:42.002749 kubelet[2731]: I0320 18:04:42.002746 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9586c919-8ef8-469e-a15f-e8b454080a40-tigera-ca-bundle\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.002749 kubelet[2731]: I0320 18:04:42.002764 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-var-run-calico\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.002749 kubelet[2731]: I0320 18:04:42.002778 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-cni-bin-dir\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.002749 kubelet[2731]: I0320 18:04:42.002793 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-lib-modules\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003037 kubelet[2731]: I0320 18:04:42.002806 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwfk\" (UniqueName: \"kubernetes.io/projected/9586c919-8ef8-469e-a15f-e8b454080a40-kube-api-access-2vwfk\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003037 kubelet[2731]: I0320 18:04:42.002821 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-856mc\" (UniqueName: \"kubernetes.io/projected/55f33bc9-a33c-4b47-93d7-14a0b8511279-kube-api-access-856mc\") pod \"calico-typha-64c5cd9bb6-n7pnb\" (UID: \"55f33bc9-a33c-4b47-93d7-14a0b8511279\") " pod="calico-system/calico-typha-64c5cd9bb6-n7pnb" Mar 20 18:04:42.003037 kubelet[2731]: I0320 18:04:42.002835 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-policysync\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003037 kubelet[2731]: I0320 18:04:42.002848 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9586c919-8ef8-469e-a15f-e8b454080a40-node-certs\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 
18:04:42.003037 kubelet[2731]: I0320 18:04:42.002862 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-cni-net-dir\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003152 kubelet[2731]: I0320 18:04:42.002876 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/55f33bc9-a33c-4b47-93d7-14a0b8511279-typha-certs\") pod \"calico-typha-64c5cd9bb6-n7pnb\" (UID: \"55f33bc9-a33c-4b47-93d7-14a0b8511279\") " pod="calico-system/calico-typha-64c5cd9bb6-n7pnb" Mar 20 18:04:42.003152 kubelet[2731]: I0320 18:04:42.002891 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-cni-log-dir\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003152 kubelet[2731]: I0320 18:04:42.002903 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-xtables-lock\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003152 kubelet[2731]: I0320 18:04:42.002916 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-var-lib-calico\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.003152 kubelet[2731]: I0320 18:04:42.002930 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9586c919-8ef8-469e-a15f-e8b454080a40-flexvol-driver-host\") pod \"calico-node-hkdwt\" (UID: \"9586c919-8ef8-469e-a15f-e8b454080a40\") " pod="calico-system/calico-node-hkdwt" Mar 20 18:04:42.007289 kubelet[2731]: I0320 18:04:42.007255 2731 topology_manager.go:215] "Topology Admit Handler" podUID="677f2c34-5e47-440b-986e-f493c61c5494" podNamespace="calico-system" podName="csi-node-driver-qn6tp" Mar 20 18:04:42.007616 kubelet[2731]: E0320 18:04:42.007535 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:42.106076 kubelet[2731]: E0320 18:04:42.106002 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.106076 kubelet[2731]: W0320 18:04:42.106032 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.106076 kubelet[2731]: E0320 18:04:42.106053 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.108047 kubelet[2731]: E0320 18:04:42.106320 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.108047 kubelet[2731]: W0320 18:04:42.106327 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.108047 kubelet[2731]: E0320 18:04:42.106385 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.108047 kubelet[2731]: E0320 18:04:42.106550 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.108047 kubelet[2731]: W0320 18:04:42.106558 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.108047 kubelet[2731]: E0320 18:04:42.106588 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.108047 kubelet[2731]: E0320 18:04:42.106779 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.108047 kubelet[2731]: W0320 18:04:42.106786 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.108047 kubelet[2731]: E0320 18:04:42.106857 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.108563 kubelet[2731]: E0320 18:04:42.108551 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.108736 kubelet[2731]: W0320 18:04:42.108583 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.108736 kubelet[2731]: E0320 18:04:42.108599 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.109109 kubelet[2731]: E0320 18:04:42.109059 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.109109 kubelet[2731]: W0320 18:04:42.109069 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.109909 kubelet[2731]: E0320 18:04:42.109189 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.110185 kubelet[2731]: E0320 18:04:42.110173 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.110251 kubelet[2731]: W0320 18:04:42.110239 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.110413 kubelet[2731]: E0320 18:04:42.110363 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.110724 kubelet[2731]: E0320 18:04:42.110712 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.110809 kubelet[2731]: W0320 18:04:42.110797 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.110916 kubelet[2731]: E0320 18:04:42.110853 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.111206 kubelet[2731]: E0320 18:04:42.111100 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.112138 kubelet[2731]: W0320 18:04:42.112015 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.112138 kubelet[2731]: E0320 18:04:42.112029 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.112423 kubelet[2731]: E0320 18:04:42.112411 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.112738 kubelet[2731]: W0320 18:04:42.112492 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.112738 kubelet[2731]: E0320 18:04:42.112505 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.113955 kubelet[2731]: E0320 18:04:42.113909 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.113955 kubelet[2731]: W0320 18:04:42.113951 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.114318 kubelet[2731]: E0320 18:04:42.113970 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.117277 kubelet[2731]: E0320 18:04:42.117256 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.117277 kubelet[2731]: W0320 18:04:42.117269 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.117576 kubelet[2731]: E0320 18:04:42.117294 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.117576 kubelet[2731]: E0320 18:04:42.117527 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.117576 kubelet[2731]: W0320 18:04:42.117538 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.117576 kubelet[2731]: E0320 18:04:42.117548 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.174758 containerd[1505]: time="2025-03-20T18:04:42.174713009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64c5cd9bb6-n7pnb,Uid:55f33bc9-a33c-4b47-93d7-14a0b8511279,Namespace:calico-system,Attempt:0,}" Mar 20 18:04:42.192572 containerd[1505]: time="2025-03-20T18:04:42.192535210Z" level=info msg="connecting to shim 9c5ca96281e408f2b3d1e9eaac9807eae01a36e7ebae87cf2a5d89881019098e" address="unix:///run/containerd/s/67ab658aa5b112c795847b9c620f2eba801c519c3807109b748088e41bcf7960" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:04:42.204277 kubelet[2731]: E0320 18:04:42.204226 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.204277 kubelet[2731]: W0320 18:04:42.204244 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.204277 kubelet[2731]: E0320 18:04:42.204262 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.204493 kubelet[2731]: I0320 18:04:42.204290 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/677f2c34-5e47-440b-986e-f493c61c5494-socket-dir\") pod \"csi-node-driver-qn6tp\" (UID: \"677f2c34-5e47-440b-986e-f493c61c5494\") " pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:42.204590 kubelet[2731]: E0320 18:04:42.204574 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.204590 kubelet[2731]: W0320 18:04:42.204586 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.204650 kubelet[2731]: E0320 18:04:42.204600 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.204650 kubelet[2731]: I0320 18:04:42.204614 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656d2\" (UniqueName: \"kubernetes.io/projected/677f2c34-5e47-440b-986e-f493c61c5494-kube-api-access-656d2\") pod \"csi-node-driver-qn6tp\" (UID: \"677f2c34-5e47-440b-986e-f493c61c5494\") " pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:42.204845 kubelet[2731]: E0320 18:04:42.204821 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.204845 kubelet[2731]: W0320 18:04:42.204832 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.204845 kubelet[2731]: E0320 18:04:42.204846 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.204919 kubelet[2731]: I0320 18:04:42.204860 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/677f2c34-5e47-440b-986e-f493c61c5494-kubelet-dir\") pod \"csi-node-driver-qn6tp\" (UID: \"677f2c34-5e47-440b-986e-f493c61c5494\") " pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:42.205082 kubelet[2731]: E0320 18:04:42.205068 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.205082 kubelet[2731]: W0320 18:04:42.205079 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.205152 kubelet[2731]: E0320 18:04:42.205092 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.205152 kubelet[2731]: I0320 18:04:42.205106 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/677f2c34-5e47-440b-986e-f493c61c5494-varrun\") pod \"csi-node-driver-qn6tp\" (UID: \"677f2c34-5e47-440b-986e-f493c61c5494\") " pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:42.205381 kubelet[2731]: E0320 18:04:42.205360 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.205422 kubelet[2731]: W0320 18:04:42.205380 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.205449 kubelet[2731]: E0320 18:04:42.205417 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.205639 kubelet[2731]: E0320 18:04:42.205627 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.205639 kubelet[2731]: W0320 18:04:42.205637 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.205693 kubelet[2731]: E0320 18:04:42.205650 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.205873 kubelet[2731]: E0320 18:04:42.205861 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.205873 kubelet[2731]: W0320 18:04:42.205870 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.205937 kubelet[2731]: E0320 18:04:42.205881 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.206080 kubelet[2731]: E0320 18:04:42.206068 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.206080 kubelet[2731]: W0320 18:04:42.206077 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.206152 kubelet[2731]: E0320 18:04:42.206093 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.206326 kubelet[2731]: E0320 18:04:42.206310 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.206326 kubelet[2731]: W0320 18:04:42.206323 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.206383 kubelet[2731]: E0320 18:04:42.206337 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.207017 kubelet[2731]: E0320 18:04:42.206612 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.207017 kubelet[2731]: W0320 18:04:42.206624 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.207017 kubelet[2731]: E0320 18:04:42.206673 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.207017 kubelet[2731]: E0320 18:04:42.206864 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.207017 kubelet[2731]: W0320 18:04:42.206871 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.207017 kubelet[2731]: E0320 18:04:42.206920 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.207017 kubelet[2731]: I0320 18:04:42.206936 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/677f2c34-5e47-440b-986e-f493c61c5494-registration-dir\") pod \"csi-node-driver-qn6tp\" (UID: \"677f2c34-5e47-440b-986e-f493c61c5494\") " pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:42.207465 kubelet[2731]: E0320 18:04:42.207293 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.207465 kubelet[2731]: W0320 18:04:42.207338 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.207465 kubelet[2731]: E0320 18:04:42.207360 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.207621 kubelet[2731]: E0320 18:04:42.207611 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.207779 kubelet[2731]: W0320 18:04:42.207662 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.207779 kubelet[2731]: E0320 18:04:42.207686 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.207919 kubelet[2731]: E0320 18:04:42.207908 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.207973 kubelet[2731]: W0320 18:04:42.207963 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.208045 kubelet[2731]: E0320 18:04:42.208008 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.208295 kubelet[2731]: E0320 18:04:42.208260 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.208295 kubelet[2731]: W0320 18:04:42.208270 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.208295 kubelet[2731]: E0320 18:04:42.208278 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.212723 containerd[1505]: time="2025-03-20T18:04:42.212689414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hkdwt,Uid:9586c919-8ef8-469e-a15f-e8b454080a40,Namespace:calico-system,Attempt:0,}" Mar 20 18:04:42.218452 systemd[1]: Started cri-containerd-9c5ca96281e408f2b3d1e9eaac9807eae01a36e7ebae87cf2a5d89881019098e.scope - libcontainer container 9c5ca96281e408f2b3d1e9eaac9807eae01a36e7ebae87cf2a5d89881019098e. Mar 20 18:04:42.232202 containerd[1505]: time="2025-03-20T18:04:42.232166789Z" level=info msg="connecting to shim 26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4" address="unix:///run/containerd/s/067aa699b6852fe72d2ec2aa6cbe95bb25744097510f8df2beb0c5055be07e6a" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:04:42.256998 systemd[1]: Started cri-containerd-26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4.scope - libcontainer container 26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4. 
Mar 20 18:04:42.307331 kubelet[2731]: E0320 18:04:42.307295 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.307331 kubelet[2731]: W0320 18:04:42.307324 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.307651 kubelet[2731]: E0320 18:04:42.307340 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.307651 kubelet[2731]: E0320 18:04:42.307602 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.307651 kubelet[2731]: W0320 18:04:42.307609 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.307651 kubelet[2731]: E0320 18:04:42.307623 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.307942 kubelet[2731]: E0320 18:04:42.307918 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.307975 kubelet[2731]: W0320 18:04:42.307946 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.308001 kubelet[2731]: E0320 18:04:42.307974 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.308212 kubelet[2731]: E0320 18:04:42.308197 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.308212 kubelet[2731]: W0320 18:04:42.308208 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.308270 kubelet[2731]: E0320 18:04:42.308222 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.308514 kubelet[2731]: E0320 18:04:42.308489 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.308551 kubelet[2731]: W0320 18:04:42.308512 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.308551 kubelet[2731]: E0320 18:04:42.308536 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.308755 kubelet[2731]: E0320 18:04:42.308741 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.308787 kubelet[2731]: W0320 18:04:42.308761 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.308787 kubelet[2731]: E0320 18:04:42.308775 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.308985 kubelet[2731]: E0320 18:04:42.308971 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.308985 kubelet[2731]: W0320 18:04:42.308982 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.309039 kubelet[2731]: E0320 18:04:42.308998 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.309192 kubelet[2731]: E0320 18:04:42.309177 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.309192 kubelet[2731]: W0320 18:04:42.309188 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.309262 kubelet[2731]: E0320 18:04:42.309203 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.309423 kubelet[2731]: E0320 18:04:42.309412 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.309423 kubelet[2731]: W0320 18:04:42.309421 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.309497 kubelet[2731]: E0320 18:04:42.309446 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.309621 kubelet[2731]: E0320 18:04:42.309607 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.309621 kubelet[2731]: W0320 18:04:42.309618 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.309685 kubelet[2731]: E0320 18:04:42.309654 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.309872 kubelet[2731]: E0320 18:04:42.309851 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.309872 kubelet[2731]: W0320 18:04:42.309861 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.310005 kubelet[2731]: E0320 18:04:42.309875 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.310389 kubelet[2731]: E0320 18:04:42.310139 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.310389 kubelet[2731]: W0320 18:04:42.310154 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.310389 kubelet[2731]: E0320 18:04:42.310172 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.310389 kubelet[2731]: E0320 18:04:42.310373 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.310389 kubelet[2731]: W0320 18:04:42.310380 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.310623 kubelet[2731]: E0320 18:04:42.310491 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.310650 kubelet[2731]: E0320 18:04:42.310639 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.310650 kubelet[2731]: W0320 18:04:42.310647 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.310695 kubelet[2731]: E0320 18:04:42.310668 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.310856 kubelet[2731]: E0320 18:04:42.310841 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.310856 kubelet[2731]: W0320 18:04:42.310851 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.310916 kubelet[2731]: E0320 18:04:42.310862 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.311079 kubelet[2731]: E0320 18:04:42.311057 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.311079 kubelet[2731]: W0320 18:04:42.311071 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.311079 kubelet[2731]: E0320 18:04:42.311085 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.311338 kubelet[2731]: E0320 18:04:42.311319 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.311338 kubelet[2731]: W0320 18:04:42.311333 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.311420 kubelet[2731]: E0320 18:04:42.311349 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.311583 kubelet[2731]: E0320 18:04:42.311568 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.311583 kubelet[2731]: W0320 18:04:42.311580 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.311651 kubelet[2731]: E0320 18:04:42.311593 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.311797 kubelet[2731]: E0320 18:04:42.311781 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.311797 kubelet[2731]: W0320 18:04:42.311794 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.311879 kubelet[2731]: E0320 18:04:42.311807 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.312178 kubelet[2731]: E0320 18:04:42.312055 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.312178 kubelet[2731]: W0320 18:04:42.312078 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.312178 kubelet[2731]: E0320 18:04:42.312119 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.312391 kubelet[2731]: E0320 18:04:42.312380 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.312527 kubelet[2731]: W0320 18:04:42.312481 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.312603 kubelet[2731]: E0320 18:04:42.312590 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.312868 kubelet[2731]: E0320 18:04:42.312851 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.312868 kubelet[2731]: W0320 18:04:42.312865 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.312969 kubelet[2731]: E0320 18:04:42.312880 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.313039 containerd[1505]: time="2025-03-20T18:04:42.313006137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64c5cd9bb6-n7pnb,Uid:55f33bc9-a33c-4b47-93d7-14a0b8511279,Namespace:calico-system,Attempt:0,} returns sandbox id \"9c5ca96281e408f2b3d1e9eaac9807eae01a36e7ebae87cf2a5d89881019098e\"" Mar 20 18:04:42.313125 kubelet[2731]: E0320 18:04:42.313060 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.313125 kubelet[2731]: W0320 18:04:42.313069 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.313125 kubelet[2731]: E0320 18:04:42.313077 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.313569 kubelet[2731]: E0320 18:04:42.313538 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.313569 kubelet[2731]: W0320 18:04:42.313549 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.313569 kubelet[2731]: E0320 18:04:42.313558 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:42.313787 kubelet[2731]: E0320 18:04:42.313773 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.313787 kubelet[2731]: W0320 18:04:42.313783 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.313787 kubelet[2731]: E0320 18:04:42.313793 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:42.314889 containerd[1505]: time="2025-03-20T18:04:42.314569535Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 20 18:04:42.314889 containerd[1505]: time="2025-03-20T18:04:42.314784071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hkdwt,Uid:9586c919-8ef8-469e-a15f-e8b454080a40,Namespace:calico-system,Attempt:0,} returns sandbox id \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\"" Mar 20 18:04:42.319021 kubelet[2731]: E0320 18:04:42.319004 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:42.319021 kubelet[2731]: W0320 18:04:42.319016 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:42.319102 kubelet[2731]: E0320 18:04:42.319029 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:43.294030 kubelet[2731]: E0320 18:04:43.293982 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:44.056814 containerd[1505]: time="2025-03-20T18:04:44.056754897Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:44.057697 containerd[1505]: time="2025-03-20T18:04:44.057658836Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 20 18:04:44.058820 containerd[1505]: time="2025-03-20T18:04:44.058788401Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:44.060689 containerd[1505]: time="2025-03-20T18:04:44.060656983Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:44.061179 containerd[1505]: time="2025-03-20T18:04:44.061139475Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 1.746543991s" Mar 20 18:04:44.061179 containerd[1505]: time="2025-03-20T18:04:44.061177848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 20 18:04:44.062035 containerd[1505]: time="2025-03-20T18:04:44.062015200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 20 18:04:44.070532 containerd[1505]: time="2025-03-20T18:04:44.070484659Z" level=info msg="CreateContainer within sandbox \"9c5ca96281e408f2b3d1e9eaac9807eae01a36e7ebae87cf2a5d89881019098e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 20 18:04:44.080755 containerd[1505]: time="2025-03-20T18:04:44.079967764Z" level=info msg="Container 291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:44.087472 containerd[1505]: time="2025-03-20T18:04:44.087433696Z" level=info msg="CreateContainer within sandbox \"9c5ca96281e408f2b3d1e9eaac9807eae01a36e7ebae87cf2a5d89881019098e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65\"" Mar 20 18:04:44.087950 containerd[1505]: time="2025-03-20T18:04:44.087877615Z" level=info msg="StartContainer for \"291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65\"" Mar 20 18:04:44.089278 containerd[1505]: time="2025-03-20T18:04:44.089156252Z" level=info msg="connecting to shim 291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65" address="unix:///run/containerd/s/67ab658aa5b112c795847b9c620f2eba801c519c3807109b748088e41bcf7960" protocol=ttrpc version=3 Mar 20 18:04:44.110426 systemd[1]: 
Started cri-containerd-291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65.scope - libcontainer container 291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65. Mar 20 18:04:44.155491 containerd[1505]: time="2025-03-20T18:04:44.155435993Z" level=info msg="StartContainer for \"291c68b76d9814c9a9f9aa4c07b813b36eff445a03b62bb43334dcd063d26b65\" returns successfully" Mar 20 18:04:44.422615 kubelet[2731]: E0320 18:04:44.422498 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.422615 kubelet[2731]: W0320 18:04:44.422527 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.422615 kubelet[2731]: E0320 18:04:44.422549 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.423121 kubelet[2731]: E0320 18:04:44.422786 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.423121 kubelet[2731]: W0320 18:04:44.422796 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.423121 kubelet[2731]: E0320 18:04:44.422807 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.423121 kubelet[2731]: E0320 18:04:44.423089 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.423121 kubelet[2731]: W0320 18:04:44.423105 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.423121 kubelet[2731]: E0320 18:04:44.423125 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.423399 kubelet[2731]: E0320 18:04:44.423380 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.423399 kubelet[2731]: W0320 18:04:44.423390 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.423399 kubelet[2731]: E0320 18:04:44.423398 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:44.423631 kubelet[2731]: E0320 18:04:44.423616 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.423631 kubelet[2731]: W0320 18:04:44.423625 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.423691 kubelet[2731]: E0320 18:04:44.423632 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.423838 kubelet[2731]: E0320 18:04:44.423823 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.423838 kubelet[2731]: W0320 18:04:44.423832 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.423884 kubelet[2731]: E0320 18:04:44.423839 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.424034 kubelet[2731]: E0320 18:04:44.424024 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.424060 kubelet[2731]: W0320 18:04:44.424033 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.424060 kubelet[2731]: E0320 18:04:44.424040 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.424262 kubelet[2731]: E0320 18:04:44.424243 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.424262 kubelet[2731]: W0320 18:04:44.424260 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.424330 kubelet[2731]: E0320 18:04:44.424269 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.424503 kubelet[2731]: E0320 18:04:44.424492 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.424503 kubelet[2731]: W0320 18:04:44.424501 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.424556 kubelet[2731]: E0320 18:04:44.424508 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:44.424752 kubelet[2731]: E0320 18:04:44.424732 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.424752 kubelet[2731]: W0320 18:04:44.424751 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.424818 kubelet[2731]: E0320 18:04:44.424764 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.425001 kubelet[2731]: E0320 18:04:44.424986 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.425028 kubelet[2731]: W0320 18:04:44.425001 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.425028 kubelet[2731]: E0320 18:04:44.425012 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.425268 kubelet[2731]: E0320 18:04:44.425242 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.425268 kubelet[2731]: W0320 18:04:44.425254 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.425268 kubelet[2731]: E0320 18:04:44.425264 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.425535 kubelet[2731]: E0320 18:04:44.425519 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.425535 kubelet[2731]: W0320 18:04:44.425530 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.425620 kubelet[2731]: E0320 18:04:44.425541 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.425772 kubelet[2731]: E0320 18:04:44.425758 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.425772 kubelet[2731]: W0320 18:04:44.425770 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.425846 kubelet[2731]: E0320 18:04:44.425779 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:44.426005 kubelet[2731]: E0320 18:04:44.425991 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.426005 kubelet[2731]: W0320 18:04:44.426002 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.426084 kubelet[2731]: E0320 18:04:44.426013 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.426349 kubelet[2731]: E0320 18:04:44.426332 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.426349 kubelet[2731]: W0320 18:04:44.426343 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.426589 kubelet[2731]: E0320 18:04:44.426356 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.426638 kubelet[2731]: E0320 18:04:44.426622 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.426688 kubelet[2731]: W0320 18:04:44.426636 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.426688 kubelet[2731]: E0320 18:04:44.426668 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.426913 kubelet[2731]: E0320 18:04:44.426897 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.426913 kubelet[2731]: W0320 18:04:44.426908 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.426993 kubelet[2731]: E0320 18:04:44.426924 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.427198 kubelet[2731]: E0320 18:04:44.427183 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.427198 kubelet[2731]: W0320 18:04:44.427194 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.427277 kubelet[2731]: E0320 18:04:44.427211 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:44.427483 kubelet[2731]: E0320 18:04:44.427463 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.427483 kubelet[2731]: W0320 18:04:44.427478 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.427550 kubelet[2731]: E0320 18:04:44.427493 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.427675 kubelet[2731]: E0320 18:04:44.427661 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.427675 kubelet[2731]: W0320 18:04:44.427671 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.427745 kubelet[2731]: E0320 18:04:44.427707 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.427903 kubelet[2731]: E0320 18:04:44.427885 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.427903 kubelet[2731]: W0320 18:04:44.427899 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.427976 kubelet[2731]: E0320 18:04:44.427931 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.428153 kubelet[2731]: E0320 18:04:44.428137 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.428153 kubelet[2731]: W0320 18:04:44.428149 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.428239 kubelet[2731]: E0320 18:04:44.428175 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.428413 kubelet[2731]: E0320 18:04:44.428397 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.428413 kubelet[2731]: W0320 18:04:44.428408 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.428503 kubelet[2731]: E0320 18:04:44.428426 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:44.428725 kubelet[2731]: E0320 18:04:44.428704 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.428725 kubelet[2731]: W0320 18:04:44.428717 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.428817 kubelet[2731]: E0320 18:04:44.428730 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.428917 kubelet[2731]: E0320 18:04:44.428900 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.428917 kubelet[2731]: W0320 18:04:44.428912 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.428978 kubelet[2731]: E0320 18:04:44.428929 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.429128 kubelet[2731]: E0320 18:04:44.429112 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.429128 kubelet[2731]: W0320 18:04:44.429123 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.429214 kubelet[2731]: E0320 18:04:44.429134 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.429397 kubelet[2731]: E0320 18:04:44.429378 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.429397 kubelet[2731]: W0320 18:04:44.429390 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.429480 kubelet[2731]: E0320 18:04:44.429403 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.429586 kubelet[2731]: E0320 18:04:44.429570 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.429586 kubelet[2731]: W0320 18:04:44.429581 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.429657 kubelet[2731]: E0320 18:04:44.429593 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:44.429831 kubelet[2731]: E0320 18:04:44.429814 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.429831 kubelet[2731]: W0320 18:04:44.429824 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.429895 kubelet[2731]: E0320 18:04:44.429836 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.430148 kubelet[2731]: E0320 18:04:44.430132 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.430148 kubelet[2731]: W0320 18:04:44.430146 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.430231 kubelet[2731]: E0320 18:04:44.430165 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.430424 kubelet[2731]: E0320 18:04:44.430409 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.430424 kubelet[2731]: W0320 18:04:44.430421 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.430505 kubelet[2731]: E0320 18:04:44.430438 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:44.430681 kubelet[2731]: E0320 18:04:44.430667 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:44.430681 kubelet[2731]: W0320 18:04:44.430678 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:44.430764 kubelet[2731]: E0320 18:04:44.430689 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.245337 kubelet[2731]: I0320 18:04:45.244842 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-64c5cd9bb6-n7pnb" podStartSLOduration=2.497038781 podStartE2EDuration="4.2448248s" podCreationTimestamp="2025-03-20 18:04:41 +0000 UTC" firstStartedPulling="2025-03-20 18:04:42.314097212 +0000 UTC m=+20.095262663" lastFinishedPulling="2025-03-20 18:04:44.061883231 +0000 UTC m=+21.843048682" observedRunningTime="2025-03-20 18:04:44.351738109 +0000 UTC m=+22.132903570" watchObservedRunningTime="2025-03-20 18:04:45.2448248 +0000 UTC m=+23.025990251" Mar 20 18:04:45.293076 kubelet[2731]: E0320 18:04:45.293018 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:45.434545 kubelet[2731]: E0320 18:04:45.434491 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.434545 kubelet[2731]: W0320 18:04:45.434515 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.434545 kubelet[2731]: E0320 18:04:45.434539 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.435189 kubelet[2731]: E0320 18:04:45.434833 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.435189 kubelet[2731]: W0320 18:04:45.434841 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.435189 kubelet[2731]: E0320 18:04:45.434849 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.435189 kubelet[2731]: E0320 18:04:45.435144 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.435189 kubelet[2731]: W0320 18:04:45.435162 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.435189 kubelet[2731]: E0320 18:04:45.435183 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.435615 kubelet[2731]: E0320 18:04:45.435478 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.435615 kubelet[2731]: W0320 18:04:45.435489 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.435615 kubelet[2731]: E0320 18:04:45.435503 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.435783 kubelet[2731]: E0320 18:04:45.435763 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.435783 kubelet[2731]: W0320 18:04:45.435777 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.435862 kubelet[2731]: E0320 18:04:45.435788 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.436015 kubelet[2731]: E0320 18:04:45.435993 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.436015 kubelet[2731]: W0320 18:04:45.436014 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.436071 kubelet[2731]: E0320 18:04:45.436023 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.436212 kubelet[2731]: E0320 18:04:45.436198 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.436212 kubelet[2731]: W0320 18:04:45.436208 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.436267 kubelet[2731]: E0320 18:04:45.436216 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.436432 kubelet[2731]: E0320 18:04:45.436414 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.436432 kubelet[2731]: W0320 18:04:45.436425 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.436432 kubelet[2731]: E0320 18:04:45.436433 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.436843 kubelet[2731]: E0320 18:04:45.436736 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.436843 kubelet[2731]: W0320 18:04:45.436747 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.436843 kubelet[2731]: E0320 18:04:45.436756 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.437032 kubelet[2731]: E0320 18:04:45.436972 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.437032 kubelet[2731]: W0320 18:04:45.436985 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.437032 kubelet[2731]: E0320 18:04:45.436999 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.437222 kubelet[2731]: E0320 18:04:45.437197 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.437222 kubelet[2731]: W0320 18:04:45.437207 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.437295 kubelet[2731]: E0320 18:04:45.437231 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.437472 kubelet[2731]: E0320 18:04:45.437434 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.437472 kubelet[2731]: W0320 18:04:45.437446 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.437472 kubelet[2731]: E0320 18:04:45.437467 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.437699 kubelet[2731]: E0320 18:04:45.437685 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.437699 kubelet[2731]: W0320 18:04:45.437696 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.437847 kubelet[2731]: E0320 18:04:45.437706 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.437952 kubelet[2731]: E0320 18:04:45.437937 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.437952 kubelet[2731]: W0320 18:04:45.437948 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.438030 kubelet[2731]: E0320 18:04:45.437958 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.438178 kubelet[2731]: E0320 18:04:45.438163 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.438178 kubelet[2731]: W0320 18:04:45.438174 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.438256 kubelet[2731]: E0320 18:04:45.438184 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.438479 kubelet[2731]: E0320 18:04:45.438443 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.438479 kubelet[2731]: W0320 18:04:45.438469 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.438479 kubelet[2731]: E0320 18:04:45.438482 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.438746 kubelet[2731]: E0320 18:04:45.438730 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.438746 kubelet[2731]: W0320 18:04:45.438742 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.438814 kubelet[2731]: E0320 18:04:45.438757 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.439122 kubelet[2731]: E0320 18:04:45.439102 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.439122 kubelet[2731]: W0320 18:04:45.439116 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.439235 kubelet[2731]: E0320 18:04:45.439131 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.439398 kubelet[2731]: E0320 18:04:45.439381 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.439398 kubelet[2731]: W0320 18:04:45.439394 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.439398 kubelet[2731]: E0320 18:04:45.439408 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.439591 kubelet[2731]: E0320 18:04:45.439577 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.439591 kubelet[2731]: W0320 18:04:45.439587 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.439664 kubelet[2731]: E0320 18:04:45.439594 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.439813 kubelet[2731]: E0320 18:04:45.439787 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.439813 kubelet[2731]: W0320 18:04:45.439800 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.439891 kubelet[2731]: E0320 18:04:45.439817 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.440069 kubelet[2731]: E0320 18:04:45.440057 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.440096 kubelet[2731]: W0320 18:04:45.440068 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.440134 kubelet[2731]: E0320 18:04:45.440104 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.440329 kubelet[2731]: E0320 18:04:45.440289 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.440329 kubelet[2731]: W0320 18:04:45.440315 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.440593 kubelet[2731]: E0320 18:04:45.440350 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.440593 kubelet[2731]: E0320 18:04:45.440537 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.440593 kubelet[2731]: W0320 18:04:45.440545 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.440593 kubelet[2731]: E0320 18:04:45.440559 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.440797 kubelet[2731]: E0320 18:04:45.440762 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.440797 kubelet[2731]: W0320 18:04:45.440774 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.440797 kubelet[2731]: E0320 18:04:45.440790 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.441017 kubelet[2731]: E0320 18:04:45.440999 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.441017 kubelet[2731]: W0320 18:04:45.441013 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.441069 kubelet[2731]: E0320 18:04:45.441029 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.441249 kubelet[2731]: E0320 18:04:45.441237 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.441275 kubelet[2731]: W0320 18:04:45.441249 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.441275 kubelet[2731]: E0320 18:04:45.441265 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.441500 kubelet[2731]: E0320 18:04:45.441487 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.441500 kubelet[2731]: W0320 18:04:45.441497 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.441554 kubelet[2731]: E0320 18:04:45.441512 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:45.441694 kubelet[2731]: E0320 18:04:45.441682 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.441694 kubelet[2731]: W0320 18:04:45.441692 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.441753 kubelet[2731]: E0320 18:04:45.441704 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.442091 kubelet[2731]: E0320 18:04:45.442077 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.442119 kubelet[2731]: W0320 18:04:45.442090 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.442119 kubelet[2731]: E0320 18:04:45.442105 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.442533 kubelet[2731]: E0320 18:04:45.442500 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.442533 kubelet[2731]: W0320 18:04:45.442529 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.442635 kubelet[2731]: E0320 18:04:45.442547 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.442949 kubelet[2731]: E0320 18:04:45.442924 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.442996 kubelet[2731]: W0320 18:04:45.442950 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.442996 kubelet[2731]: E0320 18:04:45.442967 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:45.443183 kubelet[2731]: E0320 18:04:45.443167 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:45.443183 kubelet[2731]: W0320 18:04:45.443180 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:45.443240 kubelet[2731]: E0320 18:04:45.443190 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.008966 containerd[1505]: time="2025-03-20T18:04:46.008920100Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:46.030710 containerd[1505]: time="2025-03-20T18:04:46.030625539Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 20 18:04:46.046714 containerd[1505]: time="2025-03-20T18:04:46.046630855Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:46.056343 containerd[1505]: time="2025-03-20T18:04:46.056281840Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:46.056841 containerd[1505]: time="2025-03-20T18:04:46.056808465Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.994767716s" Mar 20 18:04:46.056841 containerd[1505]: time="2025-03-20T18:04:46.056836348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 20 18:04:46.058978 containerd[1505]: time="2025-03-20T18:04:46.058948136Z" level=info msg="CreateContainer within sandbox \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 20 18:04:46.266229 systemd[1]: Started sshd@7-10.0.0.124:22-10.0.0.1:58670.service - OpenSSH per-connection server daemon (10.0.0.1:58670). Mar 20 18:04:46.315864 containerd[1505]: time="2025-03-20T18:04:46.315807086Z" level=info msg="Container e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:46.319399 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1664517030.mount: Deactivated successfully. Mar 20 18:04:46.362723 sshd[3401]: Accepted publickey for core from 10.0.0.1 port 58670 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:46.364346 sshd-session[3401]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:46.368793 systemd-logind[1485]: New session 8 of user core. Mar 20 18:04:46.377542 systemd[1]: Started session-8.scope - Session 8 of User core. 
Mar 20 18:04:46.387257 containerd[1505]: time="2025-03-20T18:04:46.387198007Z" level=info msg="CreateContainer within sandbox \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\"" Mar 20 18:04:46.388063 containerd[1505]: time="2025-03-20T18:04:46.388025610Z" level=info msg="StartContainer for \"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\"" Mar 20 18:04:46.390929 containerd[1505]: time="2025-03-20T18:04:46.390490013Z" level=info msg="connecting to shim e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a" address="unix:///run/containerd/s/067aa699b6852fe72d2ec2aa6cbe95bb25744097510f8df2beb0c5055be07e6a" protocol=ttrpc version=3 Mar 20 18:04:46.420435 systemd[1]: Started cri-containerd-e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a.scope - libcontainer container e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a. Mar 20 18:04:46.445139 kubelet[2731]: E0320 18:04:46.445114 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.445782 kubelet[2731]: W0320 18:04:46.445638 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.445782 kubelet[2731]: E0320 18:04:46.445666 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.445946 kubelet[2731]: E0320 18:04:46.445935 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.446045 kubelet[2731]: W0320 18:04:46.445991 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.446045 kubelet[2731]: E0320 18:04:46.446003 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.446458 kubelet[2731]: E0320 18:04:46.446332 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.446458 kubelet[2731]: W0320 18:04:46.446343 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.446458 kubelet[2731]: E0320 18:04:46.446351 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.449461 kubelet[2731]: E0320 18:04:46.449328 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.449461 kubelet[2731]: W0320 18:04:46.449347 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.449461 kubelet[2731]: E0320 18:04:46.449357 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.449785 kubelet[2731]: E0320 18:04:46.449698 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.449785 kubelet[2731]: W0320 18:04:46.449708 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.449785 kubelet[2731]: E0320 18:04:46.449718 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.452396 kubelet[2731]: E0320 18:04:46.452380 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.452597 kubelet[2731]: W0320 18:04:46.452480 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.452597 kubelet[2731]: E0320 18:04:46.452498 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.454399 kubelet[2731]: E0320 18:04:46.454335 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.454399 kubelet[2731]: W0320 18:04:46.454346 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.454399 kubelet[2731]: E0320 18:04:46.454356 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.456362 kubelet[2731]: E0320 18:04:46.455037 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.456362 kubelet[2731]: W0320 18:04:46.455049 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.456362 kubelet[2731]: E0320 18:04:46.455061 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.456362 kubelet[2731]: E0320 18:04:46.455497 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.456362 kubelet[2731]: W0320 18:04:46.455505 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.456362 kubelet[2731]: E0320 18:04:46.455514 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.456362 kubelet[2731]: E0320 18:04:46.456082 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.456362 kubelet[2731]: W0320 18:04:46.456092 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.456362 kubelet[2731]: E0320 18:04:46.456101 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.458259 kubelet[2731]: E0320 18:04:46.458226 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.458331 kubelet[2731]: W0320 18:04:46.458256 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.458331 kubelet[2731]: E0320 18:04:46.458282 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.458603 kubelet[2731]: E0320 18:04:46.458589 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.458754 kubelet[2731]: W0320 18:04:46.458654 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.458754 kubelet[2731]: E0320 18:04:46.458668 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.459636 kubelet[2731]: E0320 18:04:46.459623 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.459999 kubelet[2731]: W0320 18:04:46.459693 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.460134 kubelet[2731]: E0320 18:04:46.460044 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.460774 kubelet[2731]: E0320 18:04:46.460762 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.460877 kubelet[2731]: W0320 18:04:46.460822 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.460877 kubelet[2731]: E0320 18:04:46.460834 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.461698 kubelet[2731]: E0320 18:04:46.461544 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.461698 kubelet[2731]: W0320 18:04:46.461557 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.461698 kubelet[2731]: E0320 18:04:46.461566 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.462101 kubelet[2731]: E0320 18:04:46.462087 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.462284 kubelet[2731]: W0320 18:04:46.462160 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.462284 kubelet[2731]: E0320 18:04:46.462192 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.463009 kubelet[2731]: E0320 18:04:46.462985 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.463054 kubelet[2731]: W0320 18:04:46.463013 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.463054 kubelet[2731]: E0320 18:04:46.463029 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.463707 kubelet[2731]: E0320 18:04:46.463682 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.463707 kubelet[2731]: W0320 18:04:46.463704 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.463832 kubelet[2731]: E0320 18:04:46.463804 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.464336 kubelet[2731]: E0320 18:04:46.464143 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.464336 kubelet[2731]: W0320 18:04:46.464181 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.464336 kubelet[2731]: E0320 18:04:46.464195 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.465073 kubelet[2731]: E0320 18:04:46.464684 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.465073 kubelet[2731]: W0320 18:04:46.464699 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.465073 kubelet[2731]: E0320 18:04:46.464763 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.465073 kubelet[2731]: E0320 18:04:46.464994 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.465073 kubelet[2731]: W0320 18:04:46.465016 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.465404 kubelet[2731]: E0320 18:04:46.465155 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.465647 kubelet[2731]: E0320 18:04:46.465520 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.465647 kubelet[2731]: W0320 18:04:46.465533 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.465647 kubelet[2731]: E0320 18:04:46.465551 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.466038 kubelet[2731]: E0320 18:04:46.466018 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.466200 kubelet[2731]: W0320 18:04:46.466088 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.466378 kubelet[2731]: E0320 18:04:46.466360 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.466378 kubelet[2731]: W0320 18:04:46.466376 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.466473 kubelet[2731]: E0320 18:04:46.466207 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.466473 kubelet[2731]: E0320 18:04:46.466467 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.467434 kubelet[2731]: E0320 18:04:46.467356 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.467434 kubelet[2731]: W0320 18:04:46.467368 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.467434 kubelet[2731]: E0320 18:04:46.467387 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.467758 kubelet[2731]: E0320 18:04:46.467720 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.467758 kubelet[2731]: W0320 18:04:46.467734 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.467842 kubelet[2731]: E0320 18:04:46.467776 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.467995 kubelet[2731]: E0320 18:04:46.467973 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.468126 kubelet[2731]: W0320 18:04:46.468051 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.468126 kubelet[2731]: E0320 18:04:46.468114 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.469199 kubelet[2731]: E0320 18:04:46.469167 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.469199 kubelet[2731]: W0320 18:04:46.469186 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.469727 kubelet[2731]: E0320 18:04:46.469234 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.470434 kubelet[2731]: E0320 18:04:46.470400 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.470434 kubelet[2731]: W0320 18:04:46.470415 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.470561 kubelet[2731]: E0320 18:04:46.470535 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.471608 kubelet[2731]: E0320 18:04:46.471474 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.471848 kubelet[2731]: W0320 18:04:46.471712 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.472075 kubelet[2731]: E0320 18:04:46.472005 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.472075 kubelet[2731]: W0320 18:04:46.472024 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.472427 kubelet[2731]: E0320 18:04:46.472158 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.472427 kubelet[2731]: E0320 18:04:46.472170 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.472427 kubelet[2731]: E0320 18:04:46.472384 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.472427 kubelet[2731]: W0320 18:04:46.472393 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.472427 kubelet[2731]: E0320 18:04:46.472405 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 18:04:46.473417 kubelet[2731]: E0320 18:04:46.473389 2731 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 18:04:46.473521 kubelet[2731]: W0320 18:04:46.473504 2731 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 18:04:46.473630 kubelet[2731]: E0320 18:04:46.473595 2731 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 18:04:46.480565 containerd[1505]: time="2025-03-20T18:04:46.480528919Z" level=info msg="StartContainer for \"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\" returns successfully" Mar 20 18:04:46.503815 systemd[1]: cri-containerd-e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a.scope: Deactivated successfully. Mar 20 18:04:46.504143 systemd[1]: cri-containerd-e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a.scope: Consumed 46ms CPU time, 8.1M memory peak, 6.1M written to disk. Mar 20 18:04:46.508224 containerd[1505]: time="2025-03-20T18:04:46.508178415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\" id:\"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\" pid:3437 exited_at:{seconds:1742493886 nanos:506820591}" Mar 20 18:04:46.508641 containerd[1505]: time="2025-03-20T18:04:46.508623846Z" level=info msg="received exit event container_id:\"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\" id:\"e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a\" pid:3437 exited_at:{seconds:1742493886 nanos:506820591}" Mar 20 18:04:46.537530 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e42eae0a2f9ecca061127a0e7659316b80a9890e6636f8e80b6497e81217427a-rootfs.mount: Deactivated successfully. Mar 20 18:04:46.541249 sshd[3409]: Connection closed by 10.0.0.1 port 58670 Mar 20 18:04:46.542126 sshd-session[3401]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:46.546270 systemd[1]: sshd@7-10.0.0.124:22-10.0.0.1:58670.service: Deactivated successfully. Mar 20 18:04:46.548978 systemd[1]: session-8.scope: Deactivated successfully. Mar 20 18:04:46.550878 systemd-logind[1485]: Session 8 logged out. Waiting for processes to exit. Mar 20 18:04:46.552043 systemd-logind[1485]: Removed session 8. 
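The repeated kubelet messages above all describe a single condition: the FlexVolume probe execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument `init`, the binary is not installed ("executable file not found in $PATH"), so stdout comes back empty and driver-call.go fails to unmarshal it ("unexpected end of JSON input"). As a hedged illustration of the FlexVolume calling convention only (this code is not referenced by the logs), a conforming driver would answer `init` with a small JSON status object, roughly:

```go
// Hypothetical sketch: what a FlexVolume driver at .../volume/exec/nodeagent~uds/uds
// would need to print for "init" so kubelet's driver-call.go has JSON to unmarshal.
// The field names follow the usual FlexVolume convention, not the logs above.
package main

import (
	"encoding/json"
	"os"
)

// driverStatus approximates the JSON shape the FlexVolume convention uses.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	enc := json.NewEncoder(os.Stdout)
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// An empty stdout here is exactly what produces
		// "unexpected end of JSON input" in the kubelet log above.
		enc.Encode(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
		return
	}
	enc.Encode(driverStatus{Status: "Not supported"})
}
```

Either installing such a binary at the probed path or removing the stale nodeagent~uds plugin directory would be expected to quiet this probe loop; as logged, the errors are repeated noise rather than a fatal condition.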
Mar 20 18:04:47.293488 kubelet[2731]: E0320 18:04:47.293447 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:47.348922 containerd[1505]: time="2025-03-20T18:04:47.348797132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 20 18:04:49.293362 kubelet[2731]: E0320 18:04:49.293281 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:51.294368 kubelet[2731]: E0320 18:04:51.293317 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:51.562695 systemd[1]: Started sshd@8-10.0.0.124:22-10.0.0.1:54860.service - OpenSSH per-connection server daemon (10.0.0.1:54860). Mar 20 18:04:51.590058 containerd[1505]: time="2025-03-20T18:04:51.589996896Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:51.591033 containerd[1505]: time="2025-03-20T18:04:51.590955704Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 20 18:04:51.592019 containerd[1505]: time="2025-03-20T18:04:51.591982690Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:51.593923 containerd[1505]: time="2025-03-20T18:04:51.593886619Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:51.594575 containerd[1505]: time="2025-03-20T18:04:51.594516847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 4.245681212s" Mar 20 18:04:51.594575 containerd[1505]: time="2025-03-20T18:04:51.594549128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 20 18:04:51.616017 containerd[1505]: time="2025-03-20T18:04:51.615979846Z" level=info msg="CreateContainer within sandbox \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 20 18:04:51.618610 sshd[3529]: Accepted publickey for core from 10.0.0.1 port 54860 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:51.620296 sshd-session[3529]: pam_unix(sshd:session): session opened for 
user core(uid=500) by core(uid=0) Mar 20 18:04:51.625269 systemd-logind[1485]: New session 9 of user core. Mar 20 18:04:51.625794 containerd[1505]: time="2025-03-20T18:04:51.625604399Z" level=info msg="Container 69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:51.634192 containerd[1505]: time="2025-03-20T18:04:51.634149146Z" level=info msg="CreateContainer within sandbox \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\"" Mar 20 18:04:51.634519 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 20 18:04:51.637694 containerd[1505]: time="2025-03-20T18:04:51.637653822Z" level=info msg="StartContainer for \"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\"" Mar 20 18:04:51.642214 containerd[1505]: time="2025-03-20T18:04:51.640411811Z" level=info msg="connecting to shim 69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac" address="unix:///run/containerd/s/067aa699b6852fe72d2ec2aa6cbe95bb25744097510f8df2beb0c5055be07e6a" protocol=ttrpc version=3 Mar 20 18:04:51.676504 systemd[1]: Started cri-containerd-69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac.scope - libcontainer container 69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac. Mar 20 18:04:51.812675 containerd[1505]: time="2025-03-20T18:04:51.812628760Z" level=info msg="StartContainer for \"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\" returns successfully" Mar 20 18:04:51.831364 sshd[3535]: Connection closed by 10.0.0.1 port 54860 Mar 20 18:04:51.832551 sshd-session[3529]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:51.837079 systemd[1]: sshd@8-10.0.0.124:22-10.0.0.1:54860.service: Deactivated successfully. Mar 20 18:04:51.840709 systemd[1]: session-9.scope: Deactivated successfully. Mar 20 18:04:51.841849 systemd-logind[1485]: Session 9 logged out. Waiting for processes to exit. Mar 20 18:04:51.842938 systemd-logind[1485]: Removed session 9. Mar 20 18:04:52.758743 systemd[1]: cri-containerd-69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac.scope: Deactivated successfully. Mar 20 18:04:52.759251 systemd[1]: cri-containerd-69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac.scope: Consumed 581ms CPU time, 162.5M memory peak, 8K read from disk, 154M written to disk. Mar 20 18:04:52.760264 containerd[1505]: time="2025-03-20T18:04:52.760216867Z" level=info msg="received exit event container_id:\"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\" id:\"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\" pid:3549 exited_at:{seconds:1742493892 nanos:760035185}" Mar 20 18:04:52.760616 containerd[1505]: time="2025-03-20T18:04:52.760282861Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\" id:\"69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac\" pid:3549 exited_at:{seconds:1742493892 nanos:760035185}" Mar 20 18:04:52.764831 kubelet[2731]: I0320 18:04:52.764797 2731 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 20 18:04:52.782943 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-69c6e50b45a901e5db4f1b750340a56101b5873fe8150509ae28735cb46de9ac-rootfs.mount: Deactivated successfully. 
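For a rough sense of scale, the containerd records a little earlier report the calico/cni pull as 97781477 bytes read over a reported 4.245681212s, i.e. roughly 23 MB/s effective throughput. A throwaway calculation with those two values copied straight from the log (nothing else assumed):

```go
// Back-of-the-envelope throughput for the ghcr.io/flatcar/calico/cni:v3.29.2
// pull logged above; both constants are taken verbatim from the containerd messages.
package main

import (
	"fmt"
	"time"
)

func main() {
	const bytesRead = 97781477              // "bytes read=97781477"
	elapsed := 4245681212 * time.Nanosecond // "in 4.245681212s"
	mbPerSec := float64(bytesRead) / elapsed.Seconds() / 1e6
	fmt.Printf("~%.1f MB/s effective pull rate\n", mbPerSec) // prints ~23.0 MB/s
}
```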
Mar 20 18:04:52.792747 kubelet[2731]: I0320 18:04:52.792621 2731 topology_manager.go:215] "Topology Admit Handler" podUID="c4f851e4-8e7f-4611-a69f-58c7c3c807e5" podNamespace="kube-system" podName="coredns-7db6d8ff4d-h9gll" Mar 20 18:04:52.792950 kubelet[2731]: I0320 18:04:52.792835 2731 topology_manager.go:215] "Topology Admit Handler" podUID="82be44a1-17f0-4e51-b6a3-0a32047afbeb" podNamespace="calico-apiserver" podName="calico-apiserver-85cbf4758c-xz74x" Mar 20 18:04:52.792950 kubelet[2731]: I0320 18:04:52.792902 2731 topology_manager.go:215] "Topology Admit Handler" podUID="1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1" podNamespace="kube-system" podName="coredns-7db6d8ff4d-r2f89" Mar 20 18:04:52.793139 kubelet[2731]: I0320 18:04:52.792973 2731 topology_manager.go:215] "Topology Admit Handler" podUID="c0d15fad-9029-4549-86c7-afeb3307ce00" podNamespace="calico-apiserver" podName="calico-apiserver-85cbf4758c-jbp62" Mar 20 18:04:52.793139 kubelet[2731]: I0320 18:04:52.793046 2731 topology_manager.go:215] "Topology Admit Handler" podUID="035053f4-6153-4569-a465-c4bae5aa605d" podNamespace="calico-system" podName="calico-kube-controllers-6c598c664d-vl2f9" Mar 20 18:04:52.801328 systemd[1]: Created slice kubepods-besteffort-pod035053f4_6153_4569_a465_c4bae5aa605d.slice - libcontainer container kubepods-besteffort-pod035053f4_6153_4569_a465_c4bae5aa605d.slice. Mar 20 18:04:52.806128 systemd[1]: Created slice kubepods-besteffort-pod82be44a1_17f0_4e51_b6a3_0a32047afbeb.slice - libcontainer container kubepods-besteffort-pod82be44a1_17f0_4e51_b6a3_0a32047afbeb.slice. Mar 20 18:04:52.810648 systemd[1]: Created slice kubepods-burstable-podc4f851e4_8e7f_4611_a69f_58c7c3c807e5.slice - libcontainer container kubepods-burstable-podc4f851e4_8e7f_4611_a69f_58c7c3c807e5.slice. Mar 20 18:04:52.816421 systemd[1]: Created slice kubepods-burstable-pod1c8b2dde_8013_4cf0_a170_5ffd9e11c2e1.slice - libcontainer container kubepods-burstable-pod1c8b2dde_8013_4cf0_a170_5ffd9e11c2e1.slice. Mar 20 18:04:52.821960 systemd[1]: Created slice kubepods-besteffort-podc0d15fad_9029_4549_86c7_afeb3307ce00.slice - libcontainer container kubepods-besteffort-podc0d15fad_9029_4549_86c7_afeb3307ce00.slice. 
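The slice names in the systemd messages above follow directly from the pod UIDs and QoS classes in the preceding Topology Admit Handler lines: the systemd cgroup driver builds `kubepods-<qos>-pod<uid>.slice`, with the dashes in the pod UID replaced by underscores. A small sketch that reproduces the names seen in the log (the rule is inferred from these lines, not quoted from kubelet source):

```go
// Illustrative helper reproducing the slice names visible above.
package main

import (
	"fmt"
	"strings"
)

// podSlice derives the libcontainer slice name from a QoS class and pod UID,
// matching the "Created slice ..." messages in the journal.
func podSlice(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	// "Created slice kubepods-besteffort-pod035053f4_6153_4569_a465_c4bae5aa605d.slice"
	fmt.Println(podSlice("besteffort", "035053f4-6153-4569-a465-c4bae5aa605d"))
	// "Created slice kubepods-burstable-podc4f851e4_8e7f_4611_a69f_58c7c3c807e5.slice"
	fmt.Println(podSlice("burstable", "c4f851e4-8e7f-4611-a69f-58c7c3c807e5"))
}
```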
Mar 20 18:04:52.955854 kubelet[2731]: I0320 18:04:52.955807 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckwr\" (UniqueName: \"kubernetes.io/projected/c0d15fad-9029-4549-86c7-afeb3307ce00-kube-api-access-gckwr\") pod \"calico-apiserver-85cbf4758c-jbp62\" (UID: \"c0d15fad-9029-4549-86c7-afeb3307ce00\") " pod="calico-apiserver/calico-apiserver-85cbf4758c-jbp62" Mar 20 18:04:52.955854 kubelet[2731]: I0320 18:04:52.955849 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9gh\" (UniqueName: \"kubernetes.io/projected/035053f4-6153-4569-a465-c4bae5aa605d-kube-api-access-wb9gh\") pod \"calico-kube-controllers-6c598c664d-vl2f9\" (UID: \"035053f4-6153-4569-a465-c4bae5aa605d\") " pod="calico-system/calico-kube-controllers-6c598c664d-vl2f9" Mar 20 18:04:52.955854 kubelet[2731]: I0320 18:04:52.955873 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx5qj\" (UniqueName: \"kubernetes.io/projected/c4f851e4-8e7f-4611-a69f-58c7c3c807e5-kube-api-access-jx5qj\") pod \"coredns-7db6d8ff4d-h9gll\" (UID: \"c4f851e4-8e7f-4611-a69f-58c7c3c807e5\") " pod="kube-system/coredns-7db6d8ff4d-h9gll" Mar 20 18:04:52.956080 kubelet[2731]: I0320 18:04:52.955891 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/82be44a1-17f0-4e51-b6a3-0a32047afbeb-calico-apiserver-certs\") pod \"calico-apiserver-85cbf4758c-xz74x\" (UID: \"82be44a1-17f0-4e51-b6a3-0a32047afbeb\") " pod="calico-apiserver/calico-apiserver-85cbf4758c-xz74x" Mar 20 18:04:52.956080 kubelet[2731]: I0320 18:04:52.955906 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/035053f4-6153-4569-a465-c4bae5aa605d-tigera-ca-bundle\") pod \"calico-kube-controllers-6c598c664d-vl2f9\" (UID: \"035053f4-6153-4569-a465-c4bae5aa605d\") " pod="calico-system/calico-kube-controllers-6c598c664d-vl2f9" Mar 20 18:04:52.956080 kubelet[2731]: I0320 18:04:52.955953 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1-config-volume\") pod \"coredns-7db6d8ff4d-r2f89\" (UID: \"1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1\") " pod="kube-system/coredns-7db6d8ff4d-r2f89" Mar 20 18:04:52.956080 kubelet[2731]: I0320 18:04:52.955977 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c0d15fad-9029-4549-86c7-afeb3307ce00-calico-apiserver-certs\") pod \"calico-apiserver-85cbf4758c-jbp62\" (UID: \"c0d15fad-9029-4549-86c7-afeb3307ce00\") " pod="calico-apiserver/calico-apiserver-85cbf4758c-jbp62" Mar 20 18:04:52.956080 kubelet[2731]: I0320 18:04:52.955997 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7xw5\" (UniqueName: \"kubernetes.io/projected/82be44a1-17f0-4e51-b6a3-0a32047afbeb-kube-api-access-l7xw5\") pod \"calico-apiserver-85cbf4758c-xz74x\" (UID: \"82be44a1-17f0-4e51-b6a3-0a32047afbeb\") " pod="calico-apiserver/calico-apiserver-85cbf4758c-xz74x" Mar 20 18:04:52.956203 kubelet[2731]: I0320 18:04:52.956053 2731 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4f851e4-8e7f-4611-a69f-58c7c3c807e5-config-volume\") pod \"coredns-7db6d8ff4d-h9gll\" (UID: \"c4f851e4-8e7f-4611-a69f-58c7c3c807e5\") " pod="kube-system/coredns-7db6d8ff4d-h9gll" Mar 20 18:04:52.956203 kubelet[2731]: I0320 18:04:52.956068 2731 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8sp\" (UniqueName: \"kubernetes.io/projected/1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1-kube-api-access-tw8sp\") pod \"coredns-7db6d8ff4d-r2f89\" (UID: \"1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1\") " pod="kube-system/coredns-7db6d8ff4d-r2f89" Mar 20 18:04:53.105254 containerd[1505]: time="2025-03-20T18:04:53.105211352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c598c664d-vl2f9,Uid:035053f4-6153-4569-a465-c4bae5aa605d,Namespace:calico-system,Attempt:0,}" Mar 20 18:04:53.108776 containerd[1505]: time="2025-03-20T18:04:53.108732274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-xz74x,Uid:82be44a1-17f0-4e51-b6a3-0a32047afbeb,Namespace:calico-apiserver,Attempt:0,}" Mar 20 18:04:53.115471 containerd[1505]: time="2025-03-20T18:04:53.115433910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-h9gll,Uid:c4f851e4-8e7f-4611-a69f-58c7c3c807e5,Namespace:kube-system,Attempt:0,}" Mar 20 18:04:53.120674 containerd[1505]: time="2025-03-20T18:04:53.120627285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-r2f89,Uid:1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1,Namespace:kube-system,Attempt:0,}" Mar 20 18:04:53.124319 containerd[1505]: time="2025-03-20T18:04:53.124248406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-jbp62,Uid:c0d15fad-9029-4549-86c7-afeb3307ce00,Namespace:calico-apiserver,Attempt:0,}" Mar 20 18:04:53.187471 containerd[1505]: time="2025-03-20T18:04:53.187428586Z" level=error msg="Failed to destroy network for sandbox \"f8a3218258553ae7fc39f73c17994ce0c930fb8f417d3cb290954770e647dd4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.188881 containerd[1505]: time="2025-03-20T18:04:53.188850124Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c598c664d-vl2f9,Uid:035053f4-6153-4569-a465-c4bae5aa605d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8a3218258553ae7fc39f73c17994ce0c930fb8f417d3cb290954770e647dd4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.189124 kubelet[2731]: E0320 18:04:53.189079 2731 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8a3218258553ae7fc39f73c17994ce0c930fb8f417d3cb290954770e647dd4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.189195 kubelet[2731]: E0320 18:04:53.189150 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"f8a3218258553ae7fc39f73c17994ce0c930fb8f417d3cb290954770e647dd4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c598c664d-vl2f9" Mar 20 18:04:53.189195 kubelet[2731]: E0320 18:04:53.189171 2731 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8a3218258553ae7fc39f73c17994ce0c930fb8f417d3cb290954770e647dd4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c598c664d-vl2f9" Mar 20 18:04:53.189279 kubelet[2731]: E0320 18:04:53.189208 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6c598c664d-vl2f9_calico-system(035053f4-6153-4569-a465-c4bae5aa605d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6c598c664d-vl2f9_calico-system(035053f4-6153-4569-a465-c4bae5aa605d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f8a3218258553ae7fc39f73c17994ce0c930fb8f417d3cb290954770e647dd4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6c598c664d-vl2f9" podUID="035053f4-6153-4569-a465-c4bae5aa605d" Mar 20 18:04:53.208276 containerd[1505]: time="2025-03-20T18:04:53.208009299Z" level=error msg="Failed to destroy network for sandbox \"12c31dc97cdc93fcefb56e951286b53cf4009a7a9a237cbca07a377c41fe6447\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.209539 containerd[1505]: time="2025-03-20T18:04:53.209494567Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-xz74x,Uid:82be44a1-17f0-4e51-b6a3-0a32047afbeb,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12c31dc97cdc93fcefb56e951286b53cf4009a7a9a237cbca07a377c41fe6447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.209751 kubelet[2731]: E0320 18:04:53.209707 2731 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12c31dc97cdc93fcefb56e951286b53cf4009a7a9a237cbca07a377c41fe6447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.209802 kubelet[2731]: E0320 18:04:53.209768 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12c31dc97cdc93fcefb56e951286b53cf4009a7a9a237cbca07a377c41fe6447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cbf4758c-xz74x" Mar 20 18:04:53.209802 kubelet[2731]: E0320 18:04:53.209786 2731 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12c31dc97cdc93fcefb56e951286b53cf4009a7a9a237cbca07a377c41fe6447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cbf4758c-xz74x" Mar 20 18:04:53.209860 kubelet[2731]: E0320 18:04:53.209820 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85cbf4758c-xz74x_calico-apiserver(82be44a1-17f0-4e51-b6a3-0a32047afbeb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85cbf4758c-xz74x_calico-apiserver(82be44a1-17f0-4e51-b6a3-0a32047afbeb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12c31dc97cdc93fcefb56e951286b53cf4009a7a9a237cbca07a377c41fe6447\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85cbf4758c-xz74x" podUID="82be44a1-17f0-4e51-b6a3-0a32047afbeb" Mar 20 18:04:53.212899 containerd[1505]: time="2025-03-20T18:04:53.212861270Z" level=error msg="Failed to destroy network for sandbox \"60bf7da1f338d171ba2da933060043f0613ee723de28d4b86813ea503db85024\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.214208 containerd[1505]: time="2025-03-20T18:04:53.214167520Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-jbp62,Uid:c0d15fad-9029-4549-86c7-afeb3307ce00,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"60bf7da1f338d171ba2da933060043f0613ee723de28d4b86813ea503db85024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.214692 kubelet[2731]: E0320 18:04:53.214662 2731 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60bf7da1f338d171ba2da933060043f0613ee723de28d4b86813ea503db85024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.214781 kubelet[2731]: E0320 18:04:53.214707 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60bf7da1f338d171ba2da933060043f0613ee723de28d4b86813ea503db85024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cbf4758c-jbp62" Mar 20 18:04:53.214781 kubelet[2731]: E0320 18:04:53.214723 2731 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"60bf7da1f338d171ba2da933060043f0613ee723de28d4b86813ea503db85024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85cbf4758c-jbp62" Mar 20 18:04:53.214781 kubelet[2731]: E0320 18:04:53.214751 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85cbf4758c-jbp62_calico-apiserver(c0d15fad-9029-4549-86c7-afeb3307ce00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85cbf4758c-jbp62_calico-apiserver(c0d15fad-9029-4549-86c7-afeb3307ce00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60bf7da1f338d171ba2da933060043f0613ee723de28d4b86813ea503db85024\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85cbf4758c-jbp62" podUID="c0d15fad-9029-4549-86c7-afeb3307ce00" Mar 20 18:04:53.216119 containerd[1505]: time="2025-03-20T18:04:53.216093169Z" level=error msg="Failed to destroy network for sandbox \"0bbab2ece114f866956204f190ad891bd3c4e5bb812b5c764f0a15e0c2903d92\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.217493 containerd[1505]: time="2025-03-20T18:04:53.217456226Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-h9gll,Uid:c4f851e4-8e7f-4611-a69f-58c7c3c807e5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bbab2ece114f866956204f190ad891bd3c4e5bb812b5c764f0a15e0c2903d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.217720 kubelet[2731]: E0320 18:04:53.217681 2731 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bbab2ece114f866956204f190ad891bd3c4e5bb812b5c764f0a15e0c2903d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.217720 kubelet[2731]: E0320 18:04:53.217710 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bbab2ece114f866956204f190ad891bd3c4e5bb812b5c764f0a15e0c2903d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-h9gll" Mar 20 18:04:53.217791 kubelet[2731]: E0320 18:04:53.217724 2731 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bbab2ece114f866956204f190ad891bd3c4e5bb812b5c764f0a15e0c2903d92\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-h9gll" Mar 20 18:04:53.217791 kubelet[2731]: E0320 18:04:53.217749 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-h9gll_kube-system(c4f851e4-8e7f-4611-a69f-58c7c3c807e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-h9gll_kube-system(c4f851e4-8e7f-4611-a69f-58c7c3c807e5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0bbab2ece114f866956204f190ad891bd3c4e5bb812b5c764f0a15e0c2903d92\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-h9gll" podUID="c4f851e4-8e7f-4611-a69f-58c7c3c807e5" Mar 20 18:04:53.219171 containerd[1505]: time="2025-03-20T18:04:53.219128928Z" level=error msg="Failed to destroy network for sandbox \"06933310daa28b8fe1c683b974458d4232767a0a690cc4b5c372d2e5697cc8a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.220245 containerd[1505]: time="2025-03-20T18:04:53.220218030Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-r2f89,Uid:1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"06933310daa28b8fe1c683b974458d4232767a0a690cc4b5c372d2e5697cc8a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.220391 kubelet[2731]: E0320 18:04:53.220368 2731 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06933310daa28b8fe1c683b974458d4232767a0a690cc4b5c372d2e5697cc8a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.220430 kubelet[2731]: E0320 18:04:53.220395 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06933310daa28b8fe1c683b974458d4232767a0a690cc4b5c372d2e5697cc8a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-r2f89" Mar 20 18:04:53.220430 kubelet[2731]: E0320 18:04:53.220410 2731 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06933310daa28b8fe1c683b974458d4232767a0a690cc4b5c372d2e5697cc8a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-r2f89" Mar 20 18:04:53.220485 kubelet[2731]: E0320 18:04:53.220452 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-r2f89_kube-system(1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7db6d8ff4d-r2f89_kube-system(1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"06933310daa28b8fe1c683b974458d4232767a0a690cc4b5c372d2e5697cc8a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-r2f89" podUID="1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1" Mar 20 18:04:53.298529 systemd[1]: Created slice kubepods-besteffort-pod677f2c34_5e47_440b_986e_f493c61c5494.slice - libcontainer container kubepods-besteffort-pod677f2c34_5e47_440b_986e_f493c61c5494.slice. Mar 20 18:04:53.300685 containerd[1505]: time="2025-03-20T18:04:53.300656102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qn6tp,Uid:677f2c34-5e47-440b-986e-f493c61c5494,Namespace:calico-system,Attempt:0,}" Mar 20 18:04:53.347051 containerd[1505]: time="2025-03-20T18:04:53.346989453Z" level=error msg="Failed to destroy network for sandbox \"a659da69cf0e40f4e479bb54c58cd3c7126b07e3d9916a08755314a600ce0e59\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.348232 containerd[1505]: time="2025-03-20T18:04:53.348186768Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qn6tp,Uid:677f2c34-5e47-440b-986e-f493c61c5494,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a659da69cf0e40f4e479bb54c58cd3c7126b07e3d9916a08755314a600ce0e59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.348475 kubelet[2731]: E0320 18:04:53.348427 2731 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a659da69cf0e40f4e479bb54c58cd3c7126b07e3d9916a08755314a600ce0e59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 18:04:53.348527 kubelet[2731]: E0320 18:04:53.348490 2731 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a659da69cf0e40f4e479bb54c58cd3c7126b07e3d9916a08755314a600ce0e59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:53.348527 kubelet[2731]: E0320 18:04:53.348510 2731 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a659da69cf0e40f4e479bb54c58cd3c7126b07e3d9916a08755314a600ce0e59\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qn6tp" Mar 20 18:04:53.348663 kubelet[2731]: E0320 18:04:53.348554 2731 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-qn6tp_calico-system(677f2c34-5e47-440b-986e-f493c61c5494)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-qn6tp_calico-system(677f2c34-5e47-440b-986e-f493c61c5494)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a659da69cf0e40f4e479bb54c58cd3c7126b07e3d9916a08755314a600ce0e59\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-qn6tp" podUID="677f2c34-5e47-440b-986e-f493c61c5494" Mar 20 18:04:53.467471 containerd[1505]: time="2025-03-20T18:04:53.467322165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 20 18:04:56.844743 systemd[1]: Started sshd@9-10.0.0.124:22-10.0.0.1:54862.service - OpenSSH per-connection server daemon (10.0.0.1:54862). Mar 20 18:04:56.891004 sshd[3830]: Accepted publickey for core from 10.0.0.1 port 54862 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:04:56.893516 sshd-session[3830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:04:56.899416 systemd-logind[1485]: New session 10 of user core. Mar 20 18:04:56.905432 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 20 18:04:57.034560 sshd[3832]: Connection closed by 10.0.0.1 port 54862 Mar 20 18:04:57.034869 sshd-session[3830]: pam_unix(sshd:session): session closed for user core Mar 20 18:04:57.039108 systemd[1]: sshd@9-10.0.0.124:22-10.0.0.1:54862.service: Deactivated successfully. Mar 20 18:04:57.041075 systemd[1]: session-10.scope: Deactivated successfully. Mar 20 18:04:57.042263 systemd-logind[1485]: Session 10 logged out. Waiting for processes to exit. Mar 20 18:04:57.043356 systemd-logind[1485]: Removed session 10. Mar 20 18:04:57.290635 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3986262732.mount: Deactivated successfully. 
Mar 20 18:04:57.866654 containerd[1505]: time="2025-03-20T18:04:57.866593205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:57.867331 containerd[1505]: time="2025-03-20T18:04:57.867258938Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 20 18:04:57.868685 containerd[1505]: time="2025-03-20T18:04:57.868653583Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:57.870874 containerd[1505]: time="2025-03-20T18:04:57.870844527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:04:57.871433 containerd[1505]: time="2025-03-20T18:04:57.871386136Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 4.40401051s" Mar 20 18:04:57.871433 containerd[1505]: time="2025-03-20T18:04:57.871429848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 20 18:04:57.880551 containerd[1505]: time="2025-03-20T18:04:57.880477077Z" level=info msg="CreateContainer within sandbox \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 20 18:04:57.918540 containerd[1505]: time="2025-03-20T18:04:57.918479414Z" level=info msg="Container 433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:04:57.930660 containerd[1505]: time="2025-03-20T18:04:57.930610947Z" level=info msg="CreateContainer within sandbox \"26e8e8766c325c593528a9ca70c21c36f64def4a5ded62d8cb20563ee4d065c4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080\"" Mar 20 18:04:57.931254 containerd[1505]: time="2025-03-20T18:04:57.931211779Z" level=info msg="StartContainer for \"433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080\"" Mar 20 18:04:57.932836 containerd[1505]: time="2025-03-20T18:04:57.932767496Z" level=info msg="connecting to shim 433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080" address="unix:///run/containerd/s/067aa699b6852fe72d2ec2aa6cbe95bb25744097510f8df2beb0c5055be07e6a" protocol=ttrpc version=3 Mar 20 18:04:57.961428 systemd[1]: Started cri-containerd-433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080.scope - libcontainer container 433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080. Mar 20 18:04:58.106884 containerd[1505]: time="2025-03-20T18:04:58.106827406Z" level=info msg="StartContainer for \"433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080\" returns successfully" Mar 20 18:04:58.111872 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 20 18:04:58.111953 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Mar 20 18:04:58.574524 containerd[1505]: time="2025-03-20T18:04:58.574473488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080\" id:\"d6bd6eb2c0a5acd96de0116a0dd269bfbd66556bbd6da6dfc81fe5e389c3332c\" pid:3923 exit_status:1 exited_at:{seconds:1742493898 nanos:574086559}" Mar 20 18:04:59.625329 kernel: bpftool[4084]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 20 18:04:59.640780 containerd[1505]: time="2025-03-20T18:04:59.640728415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080\" id:\"a7fd915aef686c33d37a67baa0d2fcb3707ad7499831c1910a9e948b9b680441\" pid:4045 exit_status:1 exited_at:{seconds:1742493899 nanos:640387573}" Mar 20 18:04:59.853697 systemd-networkd[1430]: vxlan.calico: Link UP Mar 20 18:04:59.853709 systemd-networkd[1430]: vxlan.calico: Gained carrier Mar 20 18:05:01.757507 systemd-networkd[1430]: vxlan.calico: Gained IPv6LL Mar 20 18:05:02.047227 systemd[1]: Started sshd@10-10.0.0.124:22-10.0.0.1:55818.service - OpenSSH per-connection server daemon (10.0.0.1:55818). Mar 20 18:05:02.103676 sshd[4164]: Accepted publickey for core from 10.0.0.1 port 55818 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:02.105173 sshd-session[4164]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:02.109498 systemd-logind[1485]: New session 11 of user core. Mar 20 18:05:02.120426 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 20 18:05:02.236254 sshd[4167]: Connection closed by 10.0.0.1 port 55818 Mar 20 18:05:02.236585 sshd-session[4164]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:02.253192 systemd[1]: sshd@10-10.0.0.124:22-10.0.0.1:55818.service: Deactivated successfully. Mar 20 18:05:02.255198 systemd[1]: session-11.scope: Deactivated successfully. Mar 20 18:05:02.256901 systemd-logind[1485]: Session 11 logged out. Waiting for processes to exit. Mar 20 18:05:02.258538 systemd[1]: Started sshd@11-10.0.0.124:22-10.0.0.1:55830.service - OpenSSH per-connection server daemon (10.0.0.1:55830). Mar 20 18:05:02.259733 systemd-logind[1485]: Removed session 11. Mar 20 18:05:02.312584 sshd[4180]: Accepted publickey for core from 10.0.0.1 port 55830 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:02.313894 sshd-session[4180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:02.318058 systemd-logind[1485]: New session 12 of user core. Mar 20 18:05:02.325419 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 20 18:05:02.458560 sshd[4183]: Connection closed by 10.0.0.1 port 55830 Mar 20 18:05:02.458998 sshd-session[4180]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:02.469637 systemd[1]: sshd@11-10.0.0.124:22-10.0.0.1:55830.service: Deactivated successfully. Mar 20 18:05:02.472850 systemd[1]: session-12.scope: Deactivated successfully. Mar 20 18:05:02.475041 systemd-logind[1485]: Session 12 logged out. Waiting for processes to exit. Mar 20 18:05:02.481046 systemd[1]: Started sshd@12-10.0.0.124:22-10.0.0.1:55834.service - OpenSSH per-connection server daemon (10.0.0.1:55834). Mar 20 18:05:02.482167 systemd-logind[1485]: Removed session 12. 
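Once calico-node is running, systemd-networkd reports the vxlan.calico link coming up, gaining carrier and then an IPv6 link-local address, after which the pending sandboxes below are retried and Calico IPAM starts handing out addresses. A hedged stand-alone check of that interface state using only the Go standard library (the interface name is taken from the log; nothing Calico-specific is assumed):

```go
// Quick sanity check (illustrative, not from the logs): confirm the
// vxlan.calico interface reported by systemd-networkd is present and up.
package main

import (
	"fmt"
	"log"
	"net"
)

func main() {
	iface, err := net.InterfaceByName("vxlan.calico")
	if err != nil {
		log.Fatalf("vxlan.calico not found: %v", err)
	}
	up := iface.Flags&net.FlagUp != 0
	fmt.Printf("vxlan.calico: index=%d mtu=%d up=%v\n", iface.Index, iface.MTU, up)

	if addrs, err := iface.Addrs(); err == nil {
		for _, a := range addrs {
			fmt.Println("  addr:", a) // expect an IPv6 link-local once "Gained IPv6LL"
		}
	}
}
```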
Mar 20 18:05:02.525378 sshd[4193]: Accepted publickey for core from 10.0.0.1 port 55834 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:02.526971 sshd-session[4193]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:02.531471 systemd-logind[1485]: New session 13 of user core. Mar 20 18:05:02.540430 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 20 18:05:02.647481 sshd[4196]: Connection closed by 10.0.0.1 port 55834 Mar 20 18:05:02.647723 sshd-session[4193]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:02.652021 systemd[1]: sshd@12-10.0.0.124:22-10.0.0.1:55834.service: Deactivated successfully. Mar 20 18:05:02.654218 systemd[1]: session-13.scope: Deactivated successfully. Mar 20 18:05:02.654917 systemd-logind[1485]: Session 13 logged out. Waiting for processes to exit. Mar 20 18:05:02.655796 systemd-logind[1485]: Removed session 13. Mar 20 18:05:04.294813 containerd[1505]: time="2025-03-20T18:05:04.294693577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-h9gll,Uid:c4f851e4-8e7f-4611-a69f-58c7c3c807e5,Namespace:kube-system,Attempt:0,}" Mar 20 18:05:04.295346 containerd[1505]: time="2025-03-20T18:05:04.294693807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-xz74x,Uid:82be44a1-17f0-4e51-b6a3-0a32047afbeb,Namespace:calico-apiserver,Attempt:0,}" Mar 20 18:05:04.295346 containerd[1505]: time="2025-03-20T18:05:04.294693787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c598c664d-vl2f9,Uid:035053f4-6153-4569-a465-c4bae5aa605d,Namespace:calico-system,Attempt:0,}" Mar 20 18:05:04.443568 systemd-networkd[1430]: cali946e52f8ce7: Link UP Mar 20 18:05:04.444580 systemd-networkd[1430]: cali946e52f8ce7: Gained carrier Mar 20 18:05:04.455258 kubelet[2731]: I0320 18:05:04.454002 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hkdwt" podStartSLOduration=7.898037878 podStartE2EDuration="23.453964017s" podCreationTimestamp="2025-03-20 18:04:41 +0000 UTC" firstStartedPulling="2025-03-20 18:04:42.316161908 +0000 UTC m=+20.097327359" lastFinishedPulling="2025-03-20 18:04:57.872088047 +0000 UTC m=+35.653253498" observedRunningTime="2025-03-20 18:04:58.501029341 +0000 UTC m=+36.282194792" watchObservedRunningTime="2025-03-20 18:05:04.453964017 +0000 UTC m=+42.235129468" Mar 20 18:05:04.460880 containerd[1505]: 2025-03-20 18:05:04.340 [INFO][4208] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0 coredns-7db6d8ff4d- kube-system c4f851e4-8e7f-4611-a69f-58c7c3c807e5 762 0 2025-03-20 18:04:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-h9gll eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali946e52f8ce7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-" Mar 20 18:05:04.460880 containerd[1505]: 2025-03-20 18:05:04.341 [INFO][4208] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.460880 containerd[1505]: 2025-03-20 18:05:04.401 [INFO][4255] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" HandleID="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Workload="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.410 [INFO][4255] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" HandleID="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Workload="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00019acd0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-h9gll", "timestamp":"2025-03-20 18:05:04.401645571 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.410 [INFO][4255] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.410 [INFO][4255] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.410 [INFO][4255] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.412 [INFO][4255] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" host="localhost" Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.416 [INFO][4255] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.419 [INFO][4255] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.420 [INFO][4255] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.424 [INFO][4255] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:04.461282 containerd[1505]: 2025-03-20 18:05:04.424 [INFO][4255] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" host="localhost" Mar 20 18:05:04.461707 containerd[1505]: 2025-03-20 18:05:04.426 [INFO][4255] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad Mar 20 18:05:04.461707 containerd[1505]: 2025-03-20 18:05:04.431 [INFO][4255] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" host="localhost" Mar 20 18:05:04.461707 containerd[1505]: 2025-03-20 18:05:04.435 [INFO][4255] ipam/ipam.go 1216: Successfully 
claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" host="localhost" Mar 20 18:05:04.461707 containerd[1505]: 2025-03-20 18:05:04.435 [INFO][4255] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" host="localhost" Mar 20 18:05:04.461707 containerd[1505]: 2025-03-20 18:05:04.435 [INFO][4255] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 18:05:04.461707 containerd[1505]: 2025-03-20 18:05:04.435 [INFO][4255] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" HandleID="k8s-pod-network.213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Workload="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.461922 containerd[1505]: 2025-03-20 18:05:04.439 [INFO][4208] cni-plugin/k8s.go 386: Populated endpoint ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"c4f851e4-8e7f-4611-a69f-58c7c3c807e5", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-h9gll", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali946e52f8ce7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:04.462044 containerd[1505]: 2025-03-20 18:05:04.439 [INFO][4208] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.462044 containerd[1505]: 2025-03-20 18:05:04.439 [INFO][4208] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali946e52f8ce7 
ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.462044 containerd[1505]: 2025-03-20 18:05:04.443 [INFO][4208] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.462344 containerd[1505]: 2025-03-20 18:05:04.444 [INFO][4208] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"c4f851e4-8e7f-4611-a69f-58c7c3c807e5", ResourceVersion:"762", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad", Pod:"coredns-7db6d8ff4d-h9gll", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali946e52f8ce7", MAC:"ba:ad:dc:56:10:61", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:04.462344 containerd[1505]: 2025-03-20 18:05:04.455 [INFO][4208] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" Namespace="kube-system" Pod="coredns-7db6d8ff4d-h9gll" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--h9gll-eth0" Mar 20 18:05:04.475730 systemd-networkd[1430]: cali9795f4ba96e: Link UP Mar 20 18:05:04.476144 systemd-networkd[1430]: cali9795f4ba96e: Gained carrier Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.340 [INFO][4219] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0 calico-apiserver-85cbf4758c- calico-apiserver 82be44a1-17f0-4e51-b6a3-0a32047afbeb 761 0 2025-03-20 18:04:41 +0000 UTC 
map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85cbf4758c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-85cbf4758c-xz74x eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9795f4ba96e [] []}} ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.341 [INFO][4219] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.403 [INFO][4253] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" HandleID="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Workload="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.410 [INFO][4253] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" HandleID="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Workload="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000305dc0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-85cbf4758c-xz74x", "timestamp":"2025-03-20 18:05:04.403657554 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.411 [INFO][4253] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.435 [INFO][4253] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.435 [INFO][4253] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.438 [INFO][4253] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.443 [INFO][4253] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.448 [INFO][4253] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.449 [INFO][4253] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.452 [INFO][4253] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.452 [INFO][4253] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.456 [INFO][4253] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60 Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.463 [INFO][4253] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.468 [INFO][4253] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.468 [INFO][4253] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" host="localhost" Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.468 [INFO][4253] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 18:05:04.490231 containerd[1505]: 2025-03-20 18:05:04.468 [INFO][4253] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" HandleID="k8s-pod-network.1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Workload="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.492273 containerd[1505]: 2025-03-20 18:05:04.472 [INFO][4219] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0", GenerateName:"calico-apiserver-85cbf4758c-", Namespace:"calico-apiserver", SelfLink:"", UID:"82be44a1-17f0-4e51-b6a3-0a32047afbeb", ResourceVersion:"761", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cbf4758c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-85cbf4758c-xz74x", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9795f4ba96e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:04.492273 containerd[1505]: 2025-03-20 18:05:04.472 [INFO][4219] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.492273 containerd[1505]: 2025-03-20 18:05:04.472 [INFO][4219] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9795f4ba96e ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.492273 containerd[1505]: 2025-03-20 18:05:04.474 [INFO][4219] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.492273 containerd[1505]: 2025-03-20 18:05:04.474 [INFO][4219] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0", GenerateName:"calico-apiserver-85cbf4758c-", Namespace:"calico-apiserver", SelfLink:"", UID:"82be44a1-17f0-4e51-b6a3-0a32047afbeb", ResourceVersion:"761", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cbf4758c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60", Pod:"calico-apiserver-85cbf4758c-xz74x", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9795f4ba96e", MAC:"12:61:e2:6d:a3:ad", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:04.492273 containerd[1505]: 2025-03-20 18:05:04.485 [INFO][4219] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-xz74x" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--xz74x-eth0" Mar 20 18:05:04.512530 systemd-networkd[1430]: calif2d1926cbc9: Link UP Mar 20 18:05:04.514387 systemd-networkd[1430]: calif2d1926cbc9: Gained carrier Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.340 [INFO][4233] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0 calico-kube-controllers-6c598c664d- calico-system 035053f4-6153-4569-a465-c4bae5aa605d 757 0 2025-03-20 18:04:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6c598c664d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6c598c664d-vl2f9 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif2d1926cbc9 [] []}} ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.341 [INFO][4233] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" 
Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.401 [INFO][4257] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" HandleID="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Workload="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.410 [INFO][4257] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" HandleID="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Workload="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003af730), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6c598c664d-vl2f9", "timestamp":"2025-03-20 18:05:04.401645141 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.411 [INFO][4257] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.469 [INFO][4257] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.469 [INFO][4257] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.471 [INFO][4257] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.480 [INFO][4257] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.485 [INFO][4257] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.487 [INFO][4257] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.490 [INFO][4257] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.490 [INFO][4257] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.492 [INFO][4257] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.498 [INFO][4257] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.505 [INFO][4257] ipam/ipam.go 1216: Successfully claimed IPs: 
[192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.505 [INFO][4257] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" host="localhost" Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.505 [INFO][4257] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 18:05:04.527069 containerd[1505]: 2025-03-20 18:05:04.505 [INFO][4257] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" HandleID="k8s-pod-network.11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Workload="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.527729 containerd[1505]: 2025-03-20 18:05:04.509 [INFO][4233] cni-plugin/k8s.go 386: Populated endpoint ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0", GenerateName:"calico-kube-controllers-6c598c664d-", Namespace:"calico-system", SelfLink:"", UID:"035053f4-6153-4569-a465-c4bae5aa605d", ResourceVersion:"757", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c598c664d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6c598c664d-vl2f9", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif2d1926cbc9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:04.527729 containerd[1505]: 2025-03-20 18:05:04.509 [INFO][4233] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.527729 containerd[1505]: 2025-03-20 18:05:04.509 [INFO][4233] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif2d1926cbc9 ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" 
WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.527729 containerd[1505]: 2025-03-20 18:05:04.512 [INFO][4233] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.527729 containerd[1505]: 2025-03-20 18:05:04.513 [INFO][4233] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0", GenerateName:"calico-kube-controllers-6c598c664d-", Namespace:"calico-system", SelfLink:"", UID:"035053f4-6153-4569-a465-c4bae5aa605d", ResourceVersion:"757", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c598c664d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a", Pod:"calico-kube-controllers-6c598c664d-vl2f9", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif2d1926cbc9", MAC:"8e:19:0d:a3:c3:a7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:04.527729 containerd[1505]: 2025-03-20 18:05:04.523 [INFO][4233] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" Namespace="calico-system" Pod="calico-kube-controllers-6c598c664d-vl2f9" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6c598c664d--vl2f9-eth0" Mar 20 18:05:04.602061 containerd[1505]: time="2025-03-20T18:05:04.601798108Z" level=info msg="connecting to shim 213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad" address="unix:///run/containerd/s/da6a751cb9d10856b5f7a1af4525b00d41adb388ebe7b1e1b93118db97c6ac1f" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:05:04.602779 containerd[1505]: time="2025-03-20T18:05:04.602443011Z" level=info msg="connecting to shim 11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a" address="unix:///run/containerd/s/fd010a398c1434f9ea0c9df180884c47b6b69824076bfdeeeb870cde95becffc" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:05:04.603041 containerd[1505]: time="2025-03-20T18:05:04.603009005Z" level=info msg="connecting to shim 
1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60" address="unix:///run/containerd/s/08d6377da32f82669d327cc3a20e8085531bcc850a2babecb4fd960aabbffd82" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:05:04.661446 systemd[1]: Started cri-containerd-11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a.scope - libcontainer container 11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a. Mar 20 18:05:04.663262 systemd[1]: Started cri-containerd-1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60.scope - libcontainer container 1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60. Mar 20 18:05:04.666026 systemd[1]: Started cri-containerd-213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad.scope - libcontainer container 213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad. Mar 20 18:05:04.679012 systemd-resolved[1340]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 18:05:04.681799 systemd-resolved[1340]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 18:05:04.684579 systemd-resolved[1340]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 18:05:04.715544 containerd[1505]: time="2025-03-20T18:05:04.715269700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c598c664d-vl2f9,Uid:035053f4-6153-4569-a465-c4bae5aa605d,Namespace:calico-system,Attempt:0,} returns sandbox id \"11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a\"" Mar 20 18:05:04.718944 containerd[1505]: time="2025-03-20T18:05:04.718905926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 20 18:05:04.728599 containerd[1505]: time="2025-03-20T18:05:04.728570074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-xz74x,Uid:82be44a1-17f0-4e51-b6a3-0a32047afbeb,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60\"" Mar 20 18:05:04.730277 containerd[1505]: time="2025-03-20T18:05:04.730228041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-h9gll,Uid:c4f851e4-8e7f-4611-a69f-58c7c3c807e5,Namespace:kube-system,Attempt:0,} returns sandbox id \"213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad\"" Mar 20 18:05:04.732608 containerd[1505]: time="2025-03-20T18:05:04.732570675Z" level=info msg="CreateContainer within sandbox \"213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 18:05:04.744467 containerd[1505]: time="2025-03-20T18:05:04.744430301Z" level=info msg="Container 37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:04.755130 containerd[1505]: time="2025-03-20T18:05:04.755094047Z" level=info msg="CreateContainer within sandbox \"213e782f45976a336fb8efdf38ab46d47fe1abdb399d1550213467828c26e7ad\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401\"" Mar 20 18:05:04.755614 containerd[1505]: time="2025-03-20T18:05:04.755562338Z" level=info msg="StartContainer for \"37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401\"" Mar 20 18:05:04.756325 containerd[1505]: time="2025-03-20T18:05:04.756284546Z" level=info 
msg="connecting to shim 37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401" address="unix:///run/containerd/s/da6a751cb9d10856b5f7a1af4525b00d41adb388ebe7b1e1b93118db97c6ac1f" protocol=ttrpc version=3 Mar 20 18:05:04.776447 systemd[1]: Started cri-containerd-37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401.scope - libcontainer container 37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401. Mar 20 18:05:04.809245 containerd[1505]: time="2025-03-20T18:05:04.809200055Z" level=info msg="StartContainer for \"37866674a4a912fd6175b1bc1ccc994cc26931f3362c5c8c1e27e0cdaa65a401\" returns successfully" Mar 20 18:05:05.512680 kubelet[2731]: I0320 18:05:05.512546 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-h9gll" podStartSLOduration=29.512525069 podStartE2EDuration="29.512525069s" podCreationTimestamp="2025-03-20 18:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 18:05:05.51059456 +0000 UTC m=+43.291760031" watchObservedRunningTime="2025-03-20 18:05:05.512525069 +0000 UTC m=+43.293690520" Mar 20 18:05:05.725494 systemd-networkd[1430]: cali946e52f8ce7: Gained IPv6LL Mar 20 18:05:05.790432 systemd-networkd[1430]: cali9795f4ba96e: Gained IPv6LL Mar 20 18:05:06.294422 containerd[1505]: time="2025-03-20T18:05:06.294378119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-r2f89,Uid:1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1,Namespace:kube-system,Attempt:0,}" Mar 20 18:05:06.366960 systemd-networkd[1430]: calif2d1926cbc9: Gained IPv6LL Mar 20 18:05:06.438844 systemd-networkd[1430]: cali7c4e455711f: Link UP Mar 20 18:05:06.439039 systemd-networkd[1430]: cali7c4e455711f: Gained carrier Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.364 [INFO][4505] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0 coredns-7db6d8ff4d- kube-system 1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1 763 0 2025-03-20 18:04:36 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-r2f89 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7c4e455711f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.364 [INFO][4505] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.393 [INFO][4519] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" HandleID="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Workload="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.401 [INFO][4519] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" HandleID="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Workload="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ddca0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-r2f89", "timestamp":"2025-03-20 18:05:06.393100255 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.401 [INFO][4519] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.401 [INFO][4519] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.401 [INFO][4519] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.403 [INFO][4519] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.406 [INFO][4519] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.410 [INFO][4519] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.411 [INFO][4519] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.413 [INFO][4519] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.413 [INFO][4519] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.414 [INFO][4519] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.417 [INFO][4519] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.434 [INFO][4519] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.434 [INFO][4519] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" host="localhost" Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.434 [INFO][4519] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 18:05:06.526361 containerd[1505]: 2025-03-20 18:05:06.434 [INFO][4519] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" HandleID="k8s-pod-network.1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Workload="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.526986 containerd[1505]: 2025-03-20 18:05:06.436 [INFO][4505] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-r2f89", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7c4e455711f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:06.526986 containerd[1505]: 2025-03-20 18:05:06.437 [INFO][4505] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.526986 containerd[1505]: 2025-03-20 18:05:06.437 [INFO][4505] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7c4e455711f ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.526986 containerd[1505]: 2025-03-20 18:05:06.439 [INFO][4505] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.526986 containerd[1505]: 2025-03-20 18:05:06.439 
[INFO][4505] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1", ResourceVersion:"763", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c", Pod:"coredns-7db6d8ff4d-r2f89", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7c4e455711f", MAC:"ba:4d:25:6d:01:0c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:06.526986 containerd[1505]: 2025-03-20 18:05:06.520 [INFO][4505] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-r2f89" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--r2f89-eth0" Mar 20 18:05:06.946202 containerd[1505]: time="2025-03-20T18:05:06.946159510Z" level=info msg="connecting to shim 1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c" address="unix:///run/containerd/s/260dc25eab8bdc672bafe5689323923f23670765ad675df5cb302a099b8de2e1" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:05:06.971426 systemd[1]: Started cri-containerd-1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c.scope - libcontainer container 1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c. 
Mar 20 18:05:06.983512 systemd-resolved[1340]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 18:05:07.030249 containerd[1505]: time="2025-03-20T18:05:07.030203242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-r2f89,Uid:1c8b2dde-8013-4cf0-a170-5ffd9e11c2e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c\"" Mar 20 18:05:07.033091 containerd[1505]: time="2025-03-20T18:05:07.033053268Z" level=info msg="CreateContainer within sandbox \"1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 18:05:07.132994 containerd[1505]: time="2025-03-20T18:05:07.132926647Z" level=info msg="Container 37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:07.142419 containerd[1505]: time="2025-03-20T18:05:07.142372189Z" level=info msg="CreateContainer within sandbox \"1d4ab6fb4ff7524afa3ec4b93416197f393cc94b9f7b7813ac02965c4ce3c97c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030\"" Mar 20 18:05:07.149002 containerd[1505]: time="2025-03-20T18:05:07.148960149Z" level=info msg="StartContainer for \"37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030\"" Mar 20 18:05:07.149934 containerd[1505]: time="2025-03-20T18:05:07.149879898Z" level=info msg="connecting to shim 37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030" address="unix:///run/containerd/s/260dc25eab8bdc672bafe5689323923f23670765ad675df5cb302a099b8de2e1" protocol=ttrpc version=3 Mar 20 18:05:07.175524 systemd[1]: Started cri-containerd-37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030.scope - libcontainer container 37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030. 
Mar 20 18:05:07.208274 containerd[1505]: time="2025-03-20T18:05:07.208029286Z" level=info msg="StartContainer for \"37acfc57118219fefbdfd4e9fc95e6d51f703c8c264731969dac83ff42284030\" returns successfully" Mar 20 18:05:07.222623 containerd[1505]: time="2025-03-20T18:05:07.222576333Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:07.223803 containerd[1505]: time="2025-03-20T18:05:07.223756481Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 20 18:05:07.224955 containerd[1505]: time="2025-03-20T18:05:07.224930898Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:07.227261 containerd[1505]: time="2025-03-20T18:05:07.227221923Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:07.228208 containerd[1505]: time="2025-03-20T18:05:07.228173483Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 2.509230265s" Mar 20 18:05:07.228208 containerd[1505]: time="2025-03-20T18:05:07.228203309Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 20 18:05:07.229110 containerd[1505]: time="2025-03-20T18:05:07.229079334Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 20 18:05:07.236636 containerd[1505]: time="2025-03-20T18:05:07.235895143Z" level=info msg="CreateContainer within sandbox \"11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 20 18:05:07.244827 containerd[1505]: time="2025-03-20T18:05:07.244804006Z" level=info msg="Container 520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:07.251666 containerd[1505]: time="2025-03-20T18:05:07.251630344Z" level=info msg="CreateContainer within sandbox \"11d9b8c306a62b49f526bcafd126045f544bb14742b113721e8d885413c8168a\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea\"" Mar 20 18:05:07.252068 containerd[1505]: time="2025-03-20T18:05:07.252029444Z" level=info msg="StartContainer for \"520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea\"" Mar 20 18:05:07.252946 containerd[1505]: time="2025-03-20T18:05:07.252924706Z" level=info msg="connecting to shim 520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea" address="unix:///run/containerd/s/fd010a398c1434f9ea0c9df180884c47b6b69824076bfdeeeb870cde95becffc" protocol=ttrpc version=3 Mar 20 18:05:07.277436 systemd[1]: Started cri-containerd-520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea.scope - libcontainer container 
520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea. Mar 20 18:05:07.293682 containerd[1505]: time="2025-03-20T18:05:07.293644129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-jbp62,Uid:c0d15fad-9029-4549-86c7-afeb3307ce00,Namespace:calico-apiserver,Attempt:0,}" Mar 20 18:05:07.478672 containerd[1505]: time="2025-03-20T18:05:07.478563493Z" level=info msg="StartContainer for \"520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea\" returns successfully" Mar 20 18:05:07.500104 systemd-networkd[1430]: cali51a13299f2c: Link UP Mar 20 18:05:07.500373 systemd-networkd[1430]: cali51a13299f2c: Gained carrier Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.339 [INFO][4650] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0 calico-apiserver-85cbf4758c- calico-apiserver c0d15fad-9029-4549-86c7-afeb3307ce00 760 0 2025-03-20 18:04:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85cbf4758c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-85cbf4758c-jbp62 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali51a13299f2c [] []}} ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.358 [INFO][4650] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.439 [INFO][4675] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" HandleID="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Workload="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.446 [INFO][4675] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" HandleID="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Workload="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00038d4f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-85cbf4758c-jbp62", "timestamp":"2025-03-20 18:05:07.439237438 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.446 [INFO][4675] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.446 [INFO][4675] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.446 [INFO][4675] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.448 [INFO][4675] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.468 [INFO][4675] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.473 [INFO][4675] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.475 [INFO][4675] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.478 [INFO][4675] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.478 [INFO][4675] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.482 [INFO][4675] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.490 [INFO][4675] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.495 [INFO][4675] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.495 [INFO][4675] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" host="localhost" Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.495 [INFO][4675] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 18:05:07.516360 containerd[1505]: 2025-03-20 18:05:07.495 [INFO][4675] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" HandleID="k8s-pod-network.e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Workload="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.516926 containerd[1505]: 2025-03-20 18:05:07.498 [INFO][4650] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0", GenerateName:"calico-apiserver-85cbf4758c-", Namespace:"calico-apiserver", SelfLink:"", UID:"c0d15fad-9029-4549-86c7-afeb3307ce00", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cbf4758c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-85cbf4758c-jbp62", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali51a13299f2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:07.516926 containerd[1505]: 2025-03-20 18:05:07.498 [INFO][4650] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.516926 containerd[1505]: 2025-03-20 18:05:07.498 [INFO][4650] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali51a13299f2c ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.516926 containerd[1505]: 2025-03-20 18:05:07.500 [INFO][4650] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.516926 containerd[1505]: 2025-03-20 18:05:07.500 [INFO][4650] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0", GenerateName:"calico-apiserver-85cbf4758c-", Namespace:"calico-apiserver", SelfLink:"", UID:"c0d15fad-9029-4549-86c7-afeb3307ce00", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85cbf4758c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad", Pod:"calico-apiserver-85cbf4758c-jbp62", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali51a13299f2c", MAC:"ce:55:b2:d2:7c:34", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:07.516926 containerd[1505]: 2025-03-20 18:05:07.508 [INFO][4650] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" Namespace="calico-apiserver" Pod="calico-apiserver-85cbf4758c-jbp62" WorkloadEndpoint="localhost-k8s-calico--apiserver--85cbf4758c--jbp62-eth0" Mar 20 18:05:07.550955 kubelet[2731]: I0320 18:05:07.550156 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-r2f89" podStartSLOduration=31.550140134 podStartE2EDuration="31.550140134s" podCreationTimestamp="2025-03-20 18:04:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 18:05:07.549058772 +0000 UTC m=+45.330224233" watchObservedRunningTime="2025-03-20 18:05:07.550140134 +0000 UTC m=+45.331305585" Mar 20 18:05:07.550955 kubelet[2731]: I0320 18:05:07.550281 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6c598c664d-vl2f9" podStartSLOduration=23.039546575 podStartE2EDuration="25.55027608s" podCreationTimestamp="2025-03-20 18:04:42 +0000 UTC" firstStartedPulling="2025-03-20 18:05:04.718178428 +0000 UTC m=+42.499343879" lastFinishedPulling="2025-03-20 18:05:07.228907932 +0000 UTC m=+45.010073384" observedRunningTime="2025-03-20 18:05:07.522441631 +0000 UTC m=+45.303607082" watchObservedRunningTime="2025-03-20 18:05:07.55027608 +0000 UTC m=+45.331441531" Mar 20 18:05:07.566216 containerd[1505]: time="2025-03-20T18:05:07.566079238Z" level=info msg="TaskExit event in podsandbox handler container_id:\"520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea\" 
id:\"1401eed770bfe313c848b1117c05b30371228731e7a5ff40f41a978bad7ec37e\" pid:4712 exited_at:{seconds:1742493907 nanos:563408840}" Mar 20 18:05:07.573612 containerd[1505]: time="2025-03-20T18:05:07.573556590Z" level=info msg="connecting to shim e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad" address="unix:///run/containerd/s/6566351fa4a803d34eb97990af74113a08d6305ec2ec81b45158a07365686da6" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:05:07.601578 systemd[1]: Started cri-containerd-e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad.scope - libcontainer container e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad. Mar 20 18:05:07.619132 systemd-resolved[1340]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 18:05:07.659757 containerd[1505]: time="2025-03-20T18:05:07.659723971Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85cbf4758c-jbp62,Uid:c0d15fad-9029-4549-86c7-afeb3307ce00,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad\"" Mar 20 18:05:07.662625 systemd[1]: Started sshd@13-10.0.0.124:22-10.0.0.1:55844.service - OpenSSH per-connection server daemon (10.0.0.1:55844). Mar 20 18:05:07.721490 sshd[4775]: Accepted publickey for core from 10.0.0.1 port 55844 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:07.723060 sshd-session[4775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:07.727195 systemd-logind[1485]: New session 14 of user core. Mar 20 18:05:07.740429 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 20 18:05:07.859089 sshd[4777]: Connection closed by 10.0.0.1 port 55844 Mar 20 18:05:07.859408 sshd-session[4775]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:07.862897 systemd[1]: sshd@13-10.0.0.124:22-10.0.0.1:55844.service: Deactivated successfully. Mar 20 18:05:07.864746 systemd[1]: session-14.scope: Deactivated successfully. Mar 20 18:05:07.865498 systemd-logind[1485]: Session 14 logged out. Waiting for processes to exit. Mar 20 18:05:07.866361 systemd-logind[1485]: Removed session 14. 
Mar 20 18:05:08.157541 systemd-networkd[1430]: cali7c4e455711f: Gained IPv6LL Mar 20 18:05:08.294025 containerd[1505]: time="2025-03-20T18:05:08.293982626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qn6tp,Uid:677f2c34-5e47-440b-986e-f493c61c5494,Namespace:calico-system,Attempt:0,}" Mar 20 18:05:08.405958 systemd-networkd[1430]: cali59cf72b9bd1: Link UP Mar 20 18:05:08.406684 systemd-networkd[1430]: cali59cf72b9bd1: Gained carrier Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.335 [INFO][4790] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--qn6tp-eth0 csi-node-driver- calico-system 677f2c34-5e47-440b-986e-f493c61c5494 621 0 2025-03-20 18:04:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-qn6tp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali59cf72b9bd1 [] []}} ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.335 [INFO][4790] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.372 [INFO][4804] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" HandleID="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Workload="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.379 [INFO][4804] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" HandleID="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Workload="localhost-k8s-csi--node--driver--qn6tp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051d90), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-qn6tp", "timestamp":"2025-03-20 18:05:08.37294422 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.379 [INFO][4804] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.379 [INFO][4804] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.380 [INFO][4804] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.381 [INFO][4804] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.384 [INFO][4804] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.388 [INFO][4804] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.389 [INFO][4804] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.391 [INFO][4804] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.391 [INFO][4804] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.392 [INFO][4804] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.395 [INFO][4804] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.400 [INFO][4804] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.400 [INFO][4804] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" host="localhost" Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.400 [INFO][4804] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 18:05:08.430327 containerd[1505]: 2025-03-20 18:05:08.400 [INFO][4804] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" HandleID="k8s-pod-network.89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Workload="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.431143 containerd[1505]: 2025-03-20 18:05:08.404 [INFO][4790] cni-plugin/k8s.go 386: Populated endpoint ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--qn6tp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"677f2c34-5e47-440b-986e-f493c61c5494", ResourceVersion:"621", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-qn6tp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali59cf72b9bd1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:08.431143 containerd[1505]: 2025-03-20 18:05:08.404 [INFO][4790] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.431143 containerd[1505]: 2025-03-20 18:05:08.404 [INFO][4790] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali59cf72b9bd1 ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.431143 containerd[1505]: 2025-03-20 18:05:08.406 [INFO][4790] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.431143 containerd[1505]: 2025-03-20 18:05:08.406 [INFO][4790] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--qn6tp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"677f2c34-5e47-440b-986e-f493c61c5494", ResourceVersion:"621", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 18, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d", Pod:"csi-node-driver-qn6tp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali59cf72b9bd1", MAC:"a6:74:f2:27:4e:a2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 18:05:08.431143 containerd[1505]: 2025-03-20 18:05:08.426 [INFO][4790] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" Namespace="calico-system" Pod="csi-node-driver-qn6tp" WorkloadEndpoint="localhost-k8s-csi--node--driver--qn6tp-eth0" Mar 20 18:05:08.485547 containerd[1505]: time="2025-03-20T18:05:08.485500692Z" level=info msg="connecting to shim 89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d" address="unix:///run/containerd/s/14da0fc75e25219974cc166d13bd9020bbcfd66a374bf3e89d555d5999cf15a4" namespace=k8s.io protocol=ttrpc version=3 Mar 20 18:05:08.509605 systemd[1]: Started cri-containerd-89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d.scope - libcontainer container 89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d. 
Mar 20 18:05:08.521767 systemd-resolved[1340]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 20 18:05:08.534749 containerd[1505]: time="2025-03-20T18:05:08.534702395Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qn6tp,Uid:677f2c34-5e47-440b-986e-f493c61c5494,Namespace:calico-system,Attempt:0,} returns sandbox id \"89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d\"" Mar 20 18:05:09.309452 systemd-networkd[1430]: cali51a13299f2c: Gained IPv6LL Mar 20 18:05:10.127767 containerd[1505]: time="2025-03-20T18:05:10.127715193Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:10.128727 containerd[1505]: time="2025-03-20T18:05:10.128655169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 20 18:05:10.130035 containerd[1505]: time="2025-03-20T18:05:10.130002650Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:10.132131 containerd[1505]: time="2025-03-20T18:05:10.132090773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:10.132609 containerd[1505]: time="2025-03-20T18:05:10.132581064Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 2.903375643s" Mar 20 18:05:10.132609 containerd[1505]: time="2025-03-20T18:05:10.132608446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 18:05:10.133655 containerd[1505]: time="2025-03-20T18:05:10.133547430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 20 18:05:10.135141 containerd[1505]: time="2025-03-20T18:05:10.135102682Z" level=info msg="CreateContainer within sandbox \"1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 18:05:10.141499 systemd-networkd[1430]: cali59cf72b9bd1: Gained IPv6LL Mar 20 18:05:10.145320 containerd[1505]: time="2025-03-20T18:05:10.143328044Z" level=info msg="Container 25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:10.152914 containerd[1505]: time="2025-03-20T18:05:10.152886423Z" level=info msg="CreateContainer within sandbox \"1d2261107b1b481b36fbfe9aaf8870e02dc772ae86388aa483413ae522272b60\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee\"" Mar 20 18:05:10.153670 containerd[1505]: time="2025-03-20T18:05:10.153554677Z" level=info msg="StartContainer for \"25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee\"" Mar 20 18:05:10.154470 containerd[1505]: time="2025-03-20T18:05:10.154451934Z" level=info msg="connecting 
to shim 25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee" address="unix:///run/containerd/s/08d6377da32f82669d327cc3a20e8085531bcc850a2babecb4fd960aabbffd82" protocol=ttrpc version=3 Mar 20 18:05:10.180559 systemd[1]: Started cri-containerd-25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee.scope - libcontainer container 25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee. Mar 20 18:05:10.227056 containerd[1505]: time="2025-03-20T18:05:10.227013254Z" level=info msg="StartContainer for \"25f0019f3a28b2a4468b42d2a25248cede4bc5e80f59918b4743d126bf41abee\" returns successfully" Mar 20 18:05:10.534123 kubelet[2731]: I0320 18:05:10.533877 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85cbf4758c-xz74x" podStartSLOduration=24.129959057 podStartE2EDuration="29.533855521s" podCreationTimestamp="2025-03-20 18:04:41 +0000 UTC" firstStartedPulling="2025-03-20 18:05:04.729425262 +0000 UTC m=+42.510590713" lastFinishedPulling="2025-03-20 18:05:10.133321726 +0000 UTC m=+47.914487177" observedRunningTime="2025-03-20 18:05:10.533515101 +0000 UTC m=+48.314680562" watchObservedRunningTime="2025-03-20 18:05:10.533855521 +0000 UTC m=+48.315020982" Mar 20 18:05:10.603942 containerd[1505]: time="2025-03-20T18:05:10.603883072Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:10.604759 containerd[1505]: time="2025-03-20T18:05:10.604702401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 20 18:05:10.606774 containerd[1505]: time="2025-03-20T18:05:10.606727036Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 473.133439ms" Mar 20 18:05:10.606774 containerd[1505]: time="2025-03-20T18:05:10.606759807Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 18:05:10.608448 containerd[1505]: time="2025-03-20T18:05:10.608411420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 20 18:05:10.609678 containerd[1505]: time="2025-03-20T18:05:10.609650557Z" level=info msg="CreateContainer within sandbox \"e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 18:05:10.619327 containerd[1505]: time="2025-03-20T18:05:10.619111071Z" level=info msg="Container 10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:10.628926 containerd[1505]: time="2025-03-20T18:05:10.628880405Z" level=info msg="CreateContainer within sandbox \"e1401b05b0cea058ceb6f2425e251630f141722666bbe079b8a72ac0a10c13ad\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01\"" Mar 20 18:05:10.629690 containerd[1505]: time="2025-03-20T18:05:10.629661392Z" level=info msg="StartContainer for \"10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01\"" Mar 20 18:05:10.630654 
containerd[1505]: time="2025-03-20T18:05:10.630626245Z" level=info msg="connecting to shim 10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01" address="unix:///run/containerd/s/6566351fa4a803d34eb97990af74113a08d6305ec2ec81b45158a07365686da6" protocol=ttrpc version=3 Mar 20 18:05:10.654458 systemd[1]: Started cri-containerd-10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01.scope - libcontainer container 10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01. Mar 20 18:05:10.701715 containerd[1505]: time="2025-03-20T18:05:10.701651751Z" level=info msg="StartContainer for \"10e5974d0c9d87830cc3ab8f530be42dc4d35be8669ef016720d21ba8fbdbb01\" returns successfully" Mar 20 18:05:11.541475 kubelet[2731]: I0320 18:05:11.541292 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85cbf4758c-jbp62" podStartSLOduration=27.594796689 podStartE2EDuration="30.541275187s" podCreationTimestamp="2025-03-20 18:04:41 +0000 UTC" firstStartedPulling="2025-03-20 18:05:07.66106976 +0000 UTC m=+45.442235211" lastFinishedPulling="2025-03-20 18:05:10.607548258 +0000 UTC m=+48.388713709" observedRunningTime="2025-03-20 18:05:11.541001323 +0000 UTC m=+49.322166774" watchObservedRunningTime="2025-03-20 18:05:11.541275187 +0000 UTC m=+49.322440638" Mar 20 18:05:12.309895 containerd[1505]: time="2025-03-20T18:05:12.309847564Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:12.311434 containerd[1505]: time="2025-03-20T18:05:12.311377238Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 20 18:05:12.312580 containerd[1505]: time="2025-03-20T18:05:12.312553478Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:12.314335 containerd[1505]: time="2025-03-20T18:05:12.314295209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:12.314911 containerd[1505]: time="2025-03-20T18:05:12.314887842Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.706437109s" Mar 20 18:05:12.314960 containerd[1505]: time="2025-03-20T18:05:12.314915825Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 20 18:05:12.316781 containerd[1505]: time="2025-03-20T18:05:12.316755600Z" level=info msg="CreateContainer within sandbox \"89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 20 18:05:12.336135 containerd[1505]: time="2025-03-20T18:05:12.336070081Z" level=info msg="Container 32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:12.345671 containerd[1505]: time="2025-03-20T18:05:12.345621071Z" level=info msg="CreateContainer within 
sandbox \"89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21\"" Mar 20 18:05:12.345934 containerd[1505]: time="2025-03-20T18:05:12.345910244Z" level=info msg="StartContainer for \"32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21\"" Mar 20 18:05:12.347226 containerd[1505]: time="2025-03-20T18:05:12.347203433Z" level=info msg="connecting to shim 32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21" address="unix:///run/containerd/s/14da0fc75e25219974cc166d13bd9020bbcfd66a374bf3e89d555d5999cf15a4" protocol=ttrpc version=3 Mar 20 18:05:12.374443 systemd[1]: Started cri-containerd-32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21.scope - libcontainer container 32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21. Mar 20 18:05:12.415939 containerd[1505]: time="2025-03-20T18:05:12.415883791Z" level=info msg="StartContainer for \"32651458f61954515471fb401cb6c8d2ac41cff3f0bd5cafa24bd08545663a21\" returns successfully" Mar 20 18:05:12.417157 containerd[1505]: time="2025-03-20T18:05:12.417118210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 20 18:05:12.874396 systemd[1]: Started sshd@14-10.0.0.124:22-10.0.0.1:59962.service - OpenSSH per-connection server daemon (10.0.0.1:59962). Mar 20 18:05:13.038146 sshd[5001]: Accepted publickey for core from 10.0.0.1 port 59962 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:13.039709 sshd-session[5001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:13.044018 systemd-logind[1485]: New session 15 of user core. Mar 20 18:05:13.053580 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 20 18:05:13.173485 sshd[5003]: Connection closed by 10.0.0.1 port 59962 Mar 20 18:05:13.173731 sshd-session[5001]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:13.177616 systemd[1]: sshd@14-10.0.0.124:22-10.0.0.1:59962.service: Deactivated successfully. Mar 20 18:05:13.180029 systemd[1]: session-15.scope: Deactivated successfully. Mar 20 18:05:13.180758 systemd-logind[1485]: Session 15 logged out. Waiting for processes to exit. Mar 20 18:05:13.181584 systemd-logind[1485]: Removed session 15. 
Mar 20 18:05:14.496125 containerd[1505]: time="2025-03-20T18:05:14.495899305Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:14.496806 containerd[1505]: time="2025-03-20T18:05:14.496733962Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 20 18:05:14.498489 containerd[1505]: time="2025-03-20T18:05:14.498446528Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:14.500669 containerd[1505]: time="2025-03-20T18:05:14.500631092Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 18:05:14.501225 containerd[1505]: time="2025-03-20T18:05:14.501178440Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.084017579s" Mar 20 18:05:14.501225 containerd[1505]: time="2025-03-20T18:05:14.501220669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 20 18:05:14.503278 containerd[1505]: time="2025-03-20T18:05:14.503242205Z" level=info msg="CreateContainer within sandbox \"89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 20 18:05:14.515345 containerd[1505]: time="2025-03-20T18:05:14.515267751Z" level=info msg="Container 8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248: CDI devices from CRI Config.CDIDevices: []" Mar 20 18:05:14.527477 containerd[1505]: time="2025-03-20T18:05:14.527425193Z" level=info msg="CreateContainer within sandbox \"89bbadd38fb5fc0f9a2cff15211904de8d96f250d85e7f33fc02ec5e272c836d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248\"" Mar 20 18:05:14.528363 containerd[1505]: time="2025-03-20T18:05:14.528283335Z" level=info msg="StartContainer for \"8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248\"" Mar 20 18:05:14.530403 containerd[1505]: time="2025-03-20T18:05:14.530002204Z" level=info msg="connecting to shim 8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248" address="unix:///run/containerd/s/14da0fc75e25219974cc166d13bd9020bbcfd66a374bf3e89d555d5999cf15a4" protocol=ttrpc version=3 Mar 20 18:05:14.562642 systemd[1]: Started cri-containerd-8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248.scope - libcontainer container 8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248. 
Mar 20 18:05:14.610762 containerd[1505]: time="2025-03-20T18:05:14.610721573Z" level=info msg="StartContainer for \"8391e1fc3c83e4767b9cf891e40443e02050d2b98c2548e40932d6c876a5a248\" returns successfully" Mar 20 18:05:15.467628 kubelet[2731]: I0320 18:05:15.467580 2731 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 20 18:05:15.467628 kubelet[2731]: I0320 18:05:15.467621 2731 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 20 18:05:15.557515 kubelet[2731]: I0320 18:05:15.557206 2731 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-qn6tp" podStartSLOduration=28.592953125 podStartE2EDuration="34.557188739s" podCreationTimestamp="2025-03-20 18:04:41 +0000 UTC" firstStartedPulling="2025-03-20 18:05:08.537782643 +0000 UTC m=+46.318948094" lastFinishedPulling="2025-03-20 18:05:14.502018257 +0000 UTC m=+52.283183708" observedRunningTime="2025-03-20 18:05:15.55625721 +0000 UTC m=+53.337422661" watchObservedRunningTime="2025-03-20 18:05:15.557188739 +0000 UTC m=+53.338354190" Mar 20 18:05:18.187467 systemd[1]: Started sshd@15-10.0.0.124:22-10.0.0.1:59970.service - OpenSSH per-connection server daemon (10.0.0.1:59970). Mar 20 18:05:18.242830 sshd[5055]: Accepted publickey for core from 10.0.0.1 port 59970 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:18.244539 sshd-session[5055]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:18.248586 systemd-logind[1485]: New session 16 of user core. Mar 20 18:05:18.261446 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 20 18:05:18.380581 sshd[5057]: Connection closed by 10.0.0.1 port 59970 Mar 20 18:05:18.380989 sshd-session[5055]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:18.385752 systemd[1]: sshd@15-10.0.0.124:22-10.0.0.1:59970.service: Deactivated successfully. Mar 20 18:05:18.387841 systemd[1]: session-16.scope: Deactivated successfully. Mar 20 18:05:18.388512 systemd-logind[1485]: Session 16 logged out. Waiting for processes to exit. Mar 20 18:05:18.389315 systemd-logind[1485]: Removed session 16. Mar 20 18:05:22.181471 containerd[1505]: time="2025-03-20T18:05:22.181393565Z" level=info msg="TaskExit event in podsandbox handler container_id:\"433069f7d5902ce3d38e68e612139b71b28fbc055f4a73bd5cca5e50339a9080\" id:\"2f9c5f9a795f130e02b8f52d7a81ec849d5d844e0c7d4b30d752ada7fbf12403\" pid:5089 exited_at:{seconds:1742493922 nanos:180907913}" Mar 20 18:05:23.149279 containerd[1505]: time="2025-03-20T18:05:23.149184183Z" level=info msg="TaskExit event in podsandbox handler container_id:\"520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea\" id:\"d1983ba2f22a0610b7e4596af4a98b022af366b087c10bef2c1bd67e2baa451b\" pid:5115 exited_at:{seconds:1742493923 nanos:148926560}" Mar 20 18:05:23.393129 systemd[1]: Started sshd@16-10.0.0.124:22-10.0.0.1:35808.service - OpenSSH per-connection server daemon (10.0.0.1:35808). Mar 20 18:05:23.458960 sshd[5126]: Accepted publickey for core from 10.0.0.1 port 35808 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:23.460569 sshd-session[5126]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:23.465203 systemd-logind[1485]: New session 17 of user core. 
Mar 20 18:05:23.475427 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 20 18:05:23.594017 sshd[5128]: Connection closed by 10.0.0.1 port 35808 Mar 20 18:05:23.594505 sshd-session[5126]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:23.607996 systemd[1]: sshd@16-10.0.0.124:22-10.0.0.1:35808.service: Deactivated successfully. Mar 20 18:05:23.609750 systemd[1]: session-17.scope: Deactivated successfully. Mar 20 18:05:23.611458 systemd-logind[1485]: Session 17 logged out. Waiting for processes to exit. Mar 20 18:05:23.612680 systemd[1]: Started sshd@17-10.0.0.124:22-10.0.0.1:35824.service - OpenSSH per-connection server daemon (10.0.0.1:35824). Mar 20 18:05:23.613698 systemd-logind[1485]: Removed session 17. Mar 20 18:05:23.667040 sshd[5140]: Accepted publickey for core from 10.0.0.1 port 35824 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:23.668862 sshd-session[5140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:23.673995 systemd-logind[1485]: New session 18 of user core. Mar 20 18:05:23.683485 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 20 18:05:23.944316 sshd[5143]: Connection closed by 10.0.0.1 port 35824 Mar 20 18:05:23.944703 sshd-session[5140]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:23.953548 systemd[1]: sshd@17-10.0.0.124:22-10.0.0.1:35824.service: Deactivated successfully. Mar 20 18:05:23.955836 systemd[1]: session-18.scope: Deactivated successfully. Mar 20 18:05:23.956704 systemd-logind[1485]: Session 18 logged out. Waiting for processes to exit. Mar 20 18:05:23.959683 systemd[1]: Started sshd@18-10.0.0.124:22-10.0.0.1:35828.service - OpenSSH per-connection server daemon (10.0.0.1:35828). Mar 20 18:05:23.961046 systemd-logind[1485]: Removed session 18. Mar 20 18:05:24.020703 sshd[5153]: Accepted publickey for core from 10.0.0.1 port 35828 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:24.022386 sshd-session[5153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:24.027321 systemd-logind[1485]: New session 19 of user core. Mar 20 18:05:24.035425 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 20 18:05:25.657046 sshd[5156]: Connection closed by 10.0.0.1 port 35828 Mar 20 18:05:25.660100 sshd-session[5153]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:25.671787 systemd[1]: Started sshd@19-10.0.0.124:22-10.0.0.1:35830.service - OpenSSH per-connection server daemon (10.0.0.1:35830). Mar 20 18:05:25.672288 systemd[1]: sshd@18-10.0.0.124:22-10.0.0.1:35828.service: Deactivated successfully. Mar 20 18:05:25.675905 systemd[1]: session-19.scope: Deactivated successfully. Mar 20 18:05:25.676133 systemd[1]: session-19.scope: Consumed 653ms CPU time, 67M memory peak. Mar 20 18:05:25.678016 systemd-logind[1485]: Session 19 logged out. Waiting for processes to exit. Mar 20 18:05:25.680111 systemd-logind[1485]: Removed session 19. Mar 20 18:05:25.740480 sshd[5172]: Accepted publickey for core from 10.0.0.1 port 35830 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:25.741767 sshd-session[5172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:25.748283 systemd-logind[1485]: New session 20 of user core. Mar 20 18:05:25.756486 systemd[1]: Started session-20.scope - Session 20 of User core. 
Mar 20 18:05:26.016939 sshd[5177]: Connection closed by 10.0.0.1 port 35830 Mar 20 18:05:26.017183 sshd-session[5172]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:26.028529 systemd[1]: sshd@19-10.0.0.124:22-10.0.0.1:35830.service: Deactivated successfully. Mar 20 18:05:26.030700 systemd[1]: session-20.scope: Deactivated successfully. Mar 20 18:05:26.032324 systemd-logind[1485]: Session 20 logged out. Waiting for processes to exit. Mar 20 18:05:26.034394 systemd[1]: Started sshd@20-10.0.0.124:22-10.0.0.1:35844.service - OpenSSH per-connection server daemon (10.0.0.1:35844). Mar 20 18:05:26.035377 systemd-logind[1485]: Removed session 20. Mar 20 18:05:26.080594 sshd[5188]: Accepted publickey for core from 10.0.0.1 port 35844 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:26.082783 sshd-session[5188]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:26.087834 systemd-logind[1485]: New session 21 of user core. Mar 20 18:05:26.094439 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 20 18:05:26.203109 sshd[5191]: Connection closed by 10.0.0.1 port 35844 Mar 20 18:05:26.203495 sshd-session[5188]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:26.208069 systemd[1]: sshd@20-10.0.0.124:22-10.0.0.1:35844.service: Deactivated successfully. Mar 20 18:05:26.210136 systemd[1]: session-21.scope: Deactivated successfully. Mar 20 18:05:26.210954 systemd-logind[1485]: Session 21 logged out. Waiting for processes to exit. Mar 20 18:05:26.211804 systemd-logind[1485]: Removed session 21. Mar 20 18:05:31.218246 systemd[1]: Started sshd@21-10.0.0.124:22-10.0.0.1:60866.service - OpenSSH per-connection server daemon (10.0.0.1:60866). Mar 20 18:05:31.273159 sshd[5209]: Accepted publickey for core from 10.0.0.1 port 60866 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:31.276134 sshd-session[5209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:31.292601 systemd-logind[1485]: New session 22 of user core. Mar 20 18:05:31.298049 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 20 18:05:31.416677 sshd[5211]: Connection closed by 10.0.0.1 port 60866 Mar 20 18:05:31.416994 sshd-session[5209]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:31.421201 systemd[1]: sshd@21-10.0.0.124:22-10.0.0.1:60866.service: Deactivated successfully. Mar 20 18:05:31.423324 systemd[1]: session-22.scope: Deactivated successfully. Mar 20 18:05:31.424016 systemd-logind[1485]: Session 22 logged out. Waiting for processes to exit. Mar 20 18:05:31.424974 systemd-logind[1485]: Removed session 22. Mar 20 18:05:36.430563 systemd[1]: Started sshd@22-10.0.0.124:22-10.0.0.1:60868.service - OpenSSH per-connection server daemon (10.0.0.1:60868). Mar 20 18:05:36.481707 sshd[5225]: Accepted publickey for core from 10.0.0.1 port 60868 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:36.483255 sshd-session[5225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:36.487943 systemd-logind[1485]: New session 23 of user core. Mar 20 18:05:36.499562 systemd[1]: Started session-23.scope - Session 23 of User core. 
Mar 20 18:05:36.613450 sshd[5227]: Connection closed by 10.0.0.1 port 60868 Mar 20 18:05:36.613905 sshd-session[5225]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:36.618406 systemd[1]: sshd@22-10.0.0.124:22-10.0.0.1:60868.service: Deactivated successfully. Mar 20 18:05:36.620623 systemd[1]: session-23.scope: Deactivated successfully. Mar 20 18:05:36.621483 systemd-logind[1485]: Session 23 logged out. Waiting for processes to exit. Mar 20 18:05:36.622356 systemd-logind[1485]: Removed session 23. Mar 20 18:05:38.805739 containerd[1505]: time="2025-03-20T18:05:38.805697305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"520a1815491896bcbae33aba75f443ac2850eff587f9d686b14b72d74b77e2ea\" id:\"698c3fa081406e32f10e6fcda788bf104f1afaff6dc9426318fafb31ffda6287\" pid:5256 exited_at:{seconds:1742493938 nanos:805468838}" Mar 20 18:05:41.627262 systemd[1]: Started sshd@23-10.0.0.124:22-10.0.0.1:55390.service - OpenSSH per-connection server daemon (10.0.0.1:55390). Mar 20 18:05:41.684084 sshd[5273]: Accepted publickey for core from 10.0.0.1 port 55390 ssh2: RSA SHA256:liWVzKPSx8/oOaQKm9GrH4XFs0w+uW+AX6jAY0XmCvg Mar 20 18:05:41.685614 sshd-session[5273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 18:05:41.690254 systemd-logind[1485]: New session 24 of user core. Mar 20 18:05:41.696504 systemd[1]: Started session-24.scope - Session 24 of User core. Mar 20 18:05:41.808730 sshd[5275]: Connection closed by 10.0.0.1 port 55390 Mar 20 18:05:41.809050 sshd-session[5273]: pam_unix(sshd:session): session closed for user core Mar 20 18:05:41.813398 systemd[1]: sshd@23-10.0.0.124:22-10.0.0.1:55390.service: Deactivated successfully. Mar 20 18:05:41.815601 systemd[1]: session-24.scope: Deactivated successfully. Mar 20 18:05:41.816272 systemd-logind[1485]: Session 24 logged out. Waiting for processes to exit. Mar 20 18:05:41.817202 systemd-logind[1485]: Removed session 24.